
That's my experience. I'm not a Python developer, and installing Python programs has been a mess for decades, so I'd rather stay away from the language than try another new tool.

Over the years, I've used setup.py, pip, pipenv (which kept crashing, though it was an official recommendation), and manual venv+pip (or virtualenv? I vaguely remember there were two similar tools, and neither was part of a minimal Python install). Does uv work in all of these cases? The uv doc pointed out by the GP is vague about legacy projects, though I've just skimmed through the long page.

IIRC, Python tools didn't share their data across projects, so they could build the same heavy dependencies multiple times. I've also seen projects with incomplete dependencies (installed through Conda, IIRC) which were a major pain to get working. For many years, the only simple and sane way to run some Python code was in a Docker image, which has its own drawbacks.





> Does uv work in all of these cases?

Yes. The goal of uv is to defuck the python ecosystem and they're doing a very good job at it so far.


What are the big offenders right now? What does uv unfuck?

I only work a little bit with python.


In my experience every other python tool has a variety of slightly to extremely painful behaviours that you have to work around or at least be aware of.

Sometimes it's things like updating to Fedora 43 and every tool you installed with `pipx` breaking because it was doing things that got wiped out by the system upgrade; sometimes it's `poetry update --only dep1` silently updating dep2 in the background without telling you, because there was an update available and, even though you specified `--only`, you were wrong to do that and Poetry knows best.

Did you know that when you call `python -m venv` you should always pass `--upgrade-deps`, because otherwise it intentionally installs an out-of-date version of pip and setuptools as a joke? Maybe you're not using `python -m venv` because you ran the pyenv installer and it automatically installed `pyenv-virtualenv` without asking, which overrides a bunch of virtualenv features because the pyenv team think you should develop things in the same way they do, regardless of how you want to develop things. I hate pyenv.
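For anyone who hasn't hit this: the `--upgrade-deps` behaviour is easy to check yourself. A small sketch (the `.venv` directory name is arbitrary, and the creation step needs network access to PyPI, so it's shown commented out):

```shell
# venv bundles a pinned pip (and, on older Pythons, setuptools) with each
# CPython release, so a fresh venv often starts with an out-of-date pip.
# The --upgrade-deps flag (Python 3.9+) upgrades them at creation time:
#
#     python3 -m venv --upgrade-deps .venv
#
# Verify your interpreter supports the flag:
python3 -m venv --help | grep -- --upgrade-deps
```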

So far the only problem I've had with uv is that if you run `uv venv` it doesn't install pip in the created virtualenv because you're supposed to run `uv pip install` instead of `pip install`. That's annoying but it's not a dealbreaker.

Outside of that, I feel very confident that I could give a link to the uv docs to a junior developer and tell them to run `uv python install 3.13` and `uv tool install ruff` and then run `uv sync` in a project and everything will work out and I'm not going to have to help them recover their hard drive because they made the foolish mistake of assuming that `brew install python` wouldn't wreck their macbook when the next version of Python gets released.


uv not only completely replaces pip, pyenv & venv, it also does a much better job than any of them at their intended function, and adds a bunch of other convenient, developer-friendly features.

1. pip isn't entirely to blame for all of Python's bad package management - distutils & setuptools gave us setup.py shenanigans - but either way, uv does away with that in favour of a modern, consistent, declarative pyproject.toml manifest (PEP 621 metadata with PEP 508 dependency specifiers), along with its own well-designed lockfile (there was no accepted lockfile PEP at the time uv was created; since PEP 751 was accepted, uv has added support, though that PEP is still limited, so there's more work to do here).

2. pyenv works fine but uv is faster & adds some nice extra features with uvx

3. venv has always been a pain - ensuring you're always in the right venv, shell support, etc. uv handles this invisibly & automatically - because it's one tool you don't need to worry about running pip in the right venv or whatever.
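For point 1, a minimal sketch of what that declarative manifest looks like (project name and version pins here are hypothetical):

```toml
[project]
name = "example-app"            # hypothetical project
version = "0.1.0"
requires-python = ">=3.10"
dependencies = [
    "requests>=2.31",           # PEP 508 dependency specifiers
    "httpx[http2]>=0.27",       # extras use the [extra] syntax
]
```

Running `uv sync` against a manifest like this resolves the dependencies and writes a `uv.lock` next to it.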


pip and venv. The Python ecosystem has taken a huge step backwards with the preachy attitude that you have to do everything in a venv. Not when I want to have installable utility scripts usable from all my shells at any time or location.

I get that installing to the site-packages is a security vulnerability. Installing to my home directory is not, so why can't that be the happy path by default? Debian used to make this easy with the dist-packages split, leaving site-packages as a safe sandbox, but they caved.


Regarding why not your home directory: which version of Foo do you install, the one that Project A needs or the incompatible one that Project B needs?

The brilliant part about venvs is that A and B can have their completely separate mutually incompatible environments.


They have their place. But the default shouldn't force you into a "project" when you want general purpose applicability. Python should work from the shell as readily as it did 20 years ago. Not mysteriously break what used to work with no low-friction replacement.

It does work from the shell.

Python can work from the shell, if you don’t have external dependencies. But once you have external dependencies, with incompatible potential versions, I just don’t see how you could do this with “one environment”.
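One answer that has since emerged for exactly this "utility script with external dependencies" case is PEP 723 inline script metadata, which uv understands: the script declares its own dependencies and `uv run` builds a cached throwaway environment for it. A minimal sketch (the script name and the choice of requests are arbitrary):

```python
# /// script
# requires-python = ">=3.10"
# dependencies = ["requests>=2.31"]
# ///
# Run with `uv run fetch.py` -- uv reads the comment block above and creates
# an isolated, cached environment with requests before executing the script.
import requests

resp = requests.get("https://example.org")
print(resp.status_code)
```

No project, no manual venv: the script is self-describing and works from any shell where uv is installed.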

Why can't we just have something like npm/gradle/maven dependencies? What makes python any different?

A python virtualenv is just a slightly more complicated node_modules. Tools like PDM, Poetry and uv handle them automatically for you to the point where it effectively is the same as npm.

The thing that makes Python different is that it was never designed with any kind of per-project isolation in mind and this is the best way anyone's come up with to hack that behaviour into the language.


For years, pipx did almost all the work that I needed it to do for safely running utility scripts.

uv has replaced that for me, and has replaced most other tools that I used with the (tiny amount of) Python that I write for production.


> Not when I want to have installable utility scripts usable from all my shells at any time or location.

Can't you just have the thing on your PATH be a wrapper that invokes the tool via its venv?


That's what `uv tool install` does: it creates the wrapper and puts a symlink to it into ~/.local/bin (which you can add to PATH with `uv tool update-shell` if you don't want to do it manually). I don't recall pip doing anything helpful here; I think it still leaves it up to the end user to either add the venv's bin directory to their PATH or create the wrapper and put it somewhere already on the PATH. So it's a reasonable complaint that `pip install` has become less useful now that it resists installing tools outside of a venv but still lacks the replacement feature (which third party tools like uv and pipx do provide).
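The wrapper itself is tiny; pipx and `uv tool install` generate roughly this shape. A sketch with hypothetical paths (`mytool`, `~/.local/share/venvs/mytool`):

```shell
# Create a wrapper on PATH that runs a tool from its private venv
# without requiring any activation step. Paths are hypothetical.
mkdir -p ~/.local/bin

cat > ~/.local/bin/mytool <<'EOF'
#!/bin/sh
# Delegate to the tool's entry point inside its own venv.
exec "$HOME/.local/share/venvs/mytool/bin/mytool" "$@"
EOF
chmod +x ~/.local/bin/mytool
```

Because venv entry points embed the interpreter path in their shebang, invoking the binary directly like this runs it with the right environment every time.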

It unfucks nothing, because it wasn't fucked in the first place. The whole of uv is a solution to a nonexistent problem.

That's giving way too much credit to uv.

I'm interpreting this as "uv was built off of years of PEPs", which is true; that being said the UX of `uv` is their own, and to me has significantly reduced the amount of time I spend thinking about requirements, modules, etc.

uv is really that good.

If so, OK, let's port this prototype back to Python and get rid of uv.

What does this comment mean? Port the dependency and virtual environment manager back to the language?

Should we port npm “back” to node js?


Well, Go does have module management, including downloading new versions of the toolchain, built into the `go` tool itself. It is really great.

But I don't see this happening in Python.


You don't see that happening because you don't want to.

npm is written in JavaScript, not Rust or C#.

Yes, we should bring the package manager back into the language, if it is so awesome and solves some problem.


Sounds good, I agree that uv should come with the language in the same way npm comes with node and cargo comes with rust.

You keep using words like "we" and "us" so I assume you'll be kicking off writing the PEP to make this happen?


They've definitely not done it yet, but they're getting there.

It really isn't.

> IIRC, Python tools didn't share their data across projects, so they could build the same heavy dependencies multiple times.

One of the neatest features of uv is that it uses clever symlinking tricks so if you have a dozen different Python environments all with the same dependency there's only one copy of that dependency on disk.


Hard links, in fact. It's not hard to do: just (the Rust equivalent of) `os.link` in place of `shutil.copy`, pretty much. The actually clever part is that the package cache contains files that can be used this way, instead of just having wheels and unpacking them from scratch each time.

For pip to do this, first it would have to organize its cache in a sensible manner, such that it could work as an actual download cache. Currently it is an HTTP cache (except for locally-built wheels), where it uses a vendored third-party library to simulate the connection to files.pythonhosted.org (in the common PyPI case). But it still needs to connect to pypi.org to figure out the URI that the third-party library will simulate accessing.
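The hard-link trick described above is easy to demonstrate in miniature. A sketch (the file names are made up; both directories must live on the same filesystem for `os.link` to work):

```python
import os
import tempfile

# Simulate a package cache and one "environment" that reuses a cached file
# via a hard link instead of a copy -- the same idea as uv's cache reuse.
cache = tempfile.mkdtemp(prefix="cache-")
env = tempfile.mkdtemp(prefix="env-")

src = os.path.join(cache, "big_dep.py")
with open(src, "w") as f:
    f.write("DATA = 'x' * 1_000_000\n")

dst = os.path.join(env, "big_dep.py")
os.link(src, dst)  # hard link: a second name, not a second copy on disk

# Both names refer to the same inode, so the bytes exist only once,
# and the file's link count is now 2.
assert os.stat(src).st_ino == os.stat(dst).st_ino
assert os.stat(dst).st_nlink == 2
```

A dozen environments linking the same cached dependency this way still cost only one copy's worth of disk.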


I would not be putting up with Python if not for uv. It’s that good.

Before uv came along I was starting to write stuff in Go that I’d normally write in Python.


As a mostly-Java guy (since around 2001), I'd been away from Python for a while; my two most recent work projects have been in Python, and both switched to uv around the time I joined. Such a huge difference in time and pain - I'm with you here.

Python's always been a pretty nice language to work in, and uv makes it one of the most pleasant to deal with.


I don't even like Python as a language (it's growing on me, but only a little).

It's just so useful: uv is great and there are decent quality packages for everything imaginable.


That's partly because Python has a very large installed base and a low barrier to entry (including distribution). This leads to people running into issues sooner, and to many alternative solutions.

Unlike something like Rust, which has much fewer users (though growing) and requires PhDs in Compiler Imprecation and Lexical Exegetics.

Or C++ which has a much larger installed base but also no standard distribution method at all, and an honorary degree in Dorsal Artillery.


uv solved it, it’s safe to come back now.


