Try doing LLM inference in Python and you'll eventually understand: first you learn to use venv (or some other dependency-manager manager), then you pick pip or conda or Anaconda or something else as your dependency manager, then you try to get the actual pytorch/hf/etc. package dependencies mutually satisfied. Because there's absolutely 0% chance you can just use your system repo's Python libraries.
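To make the point concrete, here's a hedged sketch of the minimum ceremony (package names illustrative; the install line is commented out because it alone can pull in gigabytes, and whether the resolver converges depends on the day):

```shell
python3 -m venv .venv                 # step 1: the dependency-manager manager
. .venv/bin/activate                  # step 2: activate the isolated environment
# pip install torch transformers     # step 3: hope the resolver finds a mutually compatible set
# python infer.py                    # step 4: finally run your (hypothetical) inference script
```

And that's the happy path, before CUDA versions or platform wheels enter the picture.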
It's fine if you use Python every day and already have your favorite dep-manager manager, dep manager, and packages. But it's way too much complexity and fragility just to run some LLM inference application. Compiling a single file against your OS libraries and running it on your OS on your actual file system is incomparably easier, with better outcomes, for that limited-use user.
Yeah, Python is a disaster for dependency management. There are lots of examples, though, where you don't have to throw your hands in the air and aim for single files. Then again, I imagine C is a lot more old-school in terms of dependencies; I'm not sure I've ever seen a dependency tree of semvers for a C project.
It's just up to you, the author of the project. I like this approach and really dislike how some languages impose their own dependency management; it should be totally decoupled from the language, since it has nothing to do with it. Some language authors seem to believe they know better what their users need and how they're going to use the language. It makes no sense. Also, many of them seem never to have heard of cross-compiling!