Hacker News | antononcube's comments

There is a set of Raku modules that leverage LLMs for different tasks (mostly code generation) using different techniques:

- https://raku.land/zef:antononcube/LLM::Resources : Uses agentic LLM-graphs with asynchronous execution

- https://raku.land/zef:antononcube/ML::FindTextualAnswer : Finds answers to questions over provided texts (e.g. natural language code generation commands)

- https://raku.land/zef:antononcube/ML::NLPTemplateEngine : Fills in predefined code templates based on natural language code descriptions/commands

- https://raku.land/zef:antononcube/DSL::Examples : Example translations of natural language commands to executable code


I've got a few LLM modules too, mostly for handling context management:

- https://raku.land/zef:apogee/LLM::Character implements CCv3 which is a standard for managing characters (system prompts) and lorebooks (injected snippets)

- https://raku.land/zef:apogee/LLM::Chat handles context shifting for long contexts, sampler settings, templating for text completion & inferencing with or without streaming using supply/tap

- https://raku.land/zef:apogee/LLM::Data::Inference adds retries, JSON parsing & multi-model route handling to LLM::Chat

- https://raku.land/zef:apogee/LLM::Data::Pipeline allows you to declaratively build multi-step pipelines (simple agentic LLM use)

- https://raku.land/zef:apogee/HuggingFace::API is a partial wrapper around HF API for grabbing tokenizers.json & tokenizer_config.json

- https://raku.land/zef:apogee/Template::Jinja2 is a near-complete impl of Jinja2 for parsing LLM text completion templates (can be used for anything you'd use Jinja2 for)

- https://raku.land/zef:apogee/Tokenizers is a thin wrapper around HF tokenizers, for token counting mostly


That Python package, "NLPTemplateEngine", has Raku and Wolfram Language counterparts:

- Raku, "ML::NLPTemplateEngine"

  - https://raku.land/zef:antononcube/ML::NLPTemplateEngine
- Wolfram Language, "NLPTemplateEngine"

  - https://resources.wolframcloud.com/PacletRepository/resources/AntonAntonov/NLPTemplateEngine/


Related Number Theory notebooks / discussions:

- «Numerically 2026 is unremarkable yet happy: semiprime with primitive roots» https://community.wolfram.com/groups/-/m/t/3594686

- «Happy √2²²-22 -- And other ways to calculate 2026» https://community.wolfram.com/groups/-/m/t/3599161


The integer 2026 is semiprime and a happy number, with 365 as one of its primitive roots. Although 2026 may not be particularly noteworthy in number theory, this provides a great excuse to create various elaborate visualizations that reveal some interesting aspects of the number.
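The stated properties of 2026 (and the √2²² − 22 identity from the second notebook title) are easy to check programmatically. Below is a minimal Python sketch, not taken from the linked notebooks; the helper names are illustrative. The primitive-root test uses the factorization φ(2026) = 1012 = 2² · 11 · 23, which is hard-coded for this one case.

```python
from math import gcd, isqrt

def is_prime(n: int) -> bool:
    """Trial-division primality test; fine for small n."""
    if n < 2:
        return False
    return all(n % d for d in range(2, isqrt(n) + 1))

def is_semiprime(n: int) -> bool:
    """True when n is a product of exactly two primes."""
    for p in range(2, isqrt(n) + 1):
        if n % p == 0:
            return is_prime(p) and is_prime(n // p)
    return False

def is_happy(n: int) -> bool:
    """Iterate the sum of squared digits; happy numbers reach 1."""
    seen = set()
    while n != 1 and n not in seen:
        seen.add(n)
        n = sum(int(d) ** 2 for d in str(n))
    return n == 1

def is_primitive_root_mod_2026(g: int) -> bool:
    """g is a primitive root mod 2026 iff its order equals
    phi(2026) = 1012, i.e. g^(1012/q) != 1 for each prime q | 1012."""
    if gcd(g, 2026) != 1:
        return False
    phi = 1012                      # phi(2026) = phi(2) * phi(1013)
    return all(pow(g, phi // q, 2026) != 1 for q in (2, 11, 23))

print(is_semiprime(2026))           # 2026 = 2 * 1013
print(is_happy(2026))               # 2026 -> 44 -> 32 -> 13 -> 10 -> 1
print(is_primitive_root_mod_2026(365))
print(2 ** 11 - 22)                 # sqrt(2**22) - 22 = 2026
```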


Interesting variant. I might program it for some of the «Rock-Paper-Scissors extensions» here:

https://rakuforprediction.wordpress.com/2025/03/03/rock-pape...

Some of the extensions would need polyhedral dice:

https://demonstrations.wolfram.com/OpenDiceRolls/
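For flavor, here is a minimal Python sketch of one common family of such extensions: balanced cyclic rock-paper-scissors with an odd number of moves, where each move beats exactly half of the others. The function name and move numbering are illustrative and not taken from the linked blog post.

```python
def beats(i: int, j: int, n: int = 3) -> bool:
    """In balanced cyclic n-move rock-paper-scissors (n odd),
    move i beats move j when (i - j) mod n is in 1..(n-1)//2,
    so every move beats exactly (n-1)//2 of the others."""
    return (i - j) % n in range(1, (n - 1) // 2 + 1)

# Classic n = 3 with rock=0, paper=1, scissors=2:
print(beats(1, 0))  # paper beats rock -> True
print(beats(0, 2))  # rock beats scissors -> True
print(beats(2, 0))  # scissors vs rock -> False
```

A polyhedral die with n faces (e.g. `random.randrange(n)`) then serves as a uniform move picker for the n-move variants.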


This document (notebook) shows transformations of a movie dataset into a format more suitable for data analysis and for making a movie recommender system. It is the first of a three-part series of notebooks that showcase Raku packages for doing Data Science (DS).


Yes, Wolfram Language (WL) -- aka Mathematica -- introduced `Tabular` in 2025. It is a new data structure with a constellation of related functions (like `ToTabular`, `PivotToColumns`, etc.). Using it is 10-100 times faster than using WL's older `Dataset` structure. (In my experience, with both didactic and real-life data of 1,000-100,000 rows and 10-100 columns.)


This blog post (and related notebook) shows how to utilize Large Language Model (LLM) Function Calling with the Raku package "LLM::Functions".

- Package: https://raku.land/zef:antononcube/LLM::Functions

- Notebook: https://github.com/antononcube/RakuForPrediction-blog/blob/m...


Mostly, because Python is not a good "discovery" and prototyping language. It is like that by design -- Guido van Rossum decided that TMTOWTDI is counter-productive.

Another point, which I could have mentioned in my previous response -- Raku has a more elegant and easier-to-use framework for asynchronous computations.

IMO, Python's introspection matches Raku's.

Some argue that Python's LLM packages are more numerous and better than Raku's. I agree on the "more" part. I am not sure about the "better" part:

- Generally speaking, different people prefer decomposing computations in different ways.

- When, a few years ago, I re-implemented Raku's LLM packages in Python, Python did not have equally convenient packages.


Ah, yes, Raku's "LLM::Graph" is heavily inspired by the design of the function LLMGraph of Wolfram Language (aka Mathematica).

WL's LLMGraph is more developed and productized, but Raku's "LLM::Graph" is catching up.

I would like to say that "LLM::Graph" was relatively easy to program because of Raku's introspection, wrappers, asynchronous features, and pre-existing LLM-functionality packages. As a consequence, the code of "LLM::Graph" is short.

Wolfram Language does not have that level of introspection, but otherwise it is likely a better choice, mostly for its far greater scope of functionality (mathematics, graphics, computable data, etc.).

In principle a corresponding Python "LLMGraph" package can be developed, for comparison purposes. Then the "better choice" question can be answered in a more informed manner. (The Raku packages "LLM::Functions" and "LLM::Prompts" have their corresponding Python packages implemented already.)

