I have to say I'm rather more worried by the apparent lack of testing that their auto-update mechanism is actually updating anything (given how long it took them to notice that symptom) than by the fact that they're replacing some software with a not-yet-complete rewrite in their less stable, non-LTS edition.
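
A cron'd sanity check against dpkg's log would likely have caught it. A rough Python sketch (assumes the stock /var/log/dpkg.log format, ignores log rotation, and the 14-day threshold is an arbitrary number I picked):

    # Alert if no package upgrade has landed recently, which would
    # suggest the auto-update mechanism is silently doing nothing.
    import re
    import sys
    from datetime import datetime, timedelta

    THRESHOLD = timedelta(days=14)  # arbitrary; tune to your update cadence

    last = None
    with open("/var/log/dpkg.log") as log:
        for line in log:
            # dpkg logs upgrades as: "YYYY-MM-DD HH:MM:SS upgrade pkg:arch old new"
            m = re.match(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) upgrade ", line)
            if m:
                last = datetime.strptime(m.group(1), "%Y-%m-%d %H:%M:%S")

    if last is None or datetime.now() - last > THRESHOLD:
        print("WARNING: no package upgrades seen since", last)
        sys.exit(1)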


There is no such thing as a less-stable non-LTS edition. That's the stable version. The LTS version is just a stable version that gets updated for longer. Non-LTS absolutely shouldn't mean unstable.


It seems less stable in the sense that

1. It literally remains stable for less time. Nine months instead of 5+ years, up to 12 if you pay them.

2. They apparently have a history of testing changes in it.

3. They appear to only sell things like livepatch and extended support for LTS editions, and products you pay for are implicitly more stable than products you do not.


Historically, they've also pushed things out of an LTS release that could have gone in, making people wait for the next non-LTS release, because those things were too new or experimental. If something proves good, it'll be in the next LTS; if not, it can be dropped from the next non-LTS without too much impact.

Or, to use Ubuntu's own terminology: "Interim releases will introduce new capabilities from Canonical and upstream open source projects, they serve as a proving ground for these new capabilities." They also call LTS releases 'enterprise grade' while interims are merely 'production-quality'. Personally, I see these as different levels of stability.


> It literally remains stable for less time. Nine months instead of 5+ years, up to 12 if you pay them.

Isn't "stability" in this context a direct reference to feature set which stays stable? When a version is designated stable it stays stable. You're talking about support which can be longer or shorter regardless of feature set.

When they stop adding features, it's stable. Every old xx.04 and xx.10 version of Ubuntu is stable even today; no more features are getting added to 12.10. When they stop offering support, it's unsupported. 14.04 LTS became unsupported last year, but it didn't become any less stable.

These are orthogonal. You can offer long-term support for any possible feature combination (if you have the resources), and you can be stable with no support. In reality it's easier to freeze a feature set and support that snapshot for a long time than to chase a moving target.
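
Or, as a toy model of the two independent axes (the 12.10 dates here are roughly right but purely illustrative, not Canonical's official schedule):

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Release:
        name: str
        feature_freeze: date  # after this, the feature set is "stable"
        support_end: date     # after this, it is "unsupported"

        def is_stable(self, today: date) -> bool:
            return today >= self.feature_freeze

        def is_supported(self, today: date) -> bool:
            return today <= self.support_end

    # illustrative dates only
    quantal = Release("12.10", date(2012, 10, 18), date(2014, 5, 16))
    today = date(2025, 1, 1)
    print(quantal.is_stable(today), quantal.is_supported(today))  # True False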


I can see where you're coming from, but I'd prefer to describe practically all stable software as living in an unstable equilibrium within the usable region of state-space. When the stabilizing force of security patches, certificate updates, adaptation to new hardware requirements, and so forth disappears, the software falls out of the usable region into what is, I suppose, the stable equilibrium of unusable software. And in the case of a Linux distribution, this fall happens quite rapidly.

Applying the word "stable" to things in the unusable region of state space seems technically, but only technically, correct.


Not meant as a jab at Ubuntu, but I don't think people choose Ubuntu for engineering rigor. If you want something dull, predictable, and known for rigor, then OpenBSD, illumos, FreeBSD, etc. seem like more likely choices.


Or Debian and Red Hat, which have the added bonus of being "boring technology."

If you have a problem with them, 20 other people have had the same problem before you did, two of them have posted about it on Stack Overflow, and one wrote a blog post.

OpenBSD and illumos may be cool, but you really need to know what you're doing to use them.


For me, it's been more about the online help: you're most likely to find an Ubuntu-centric answer when you have issues. Of course, you also have to consider the date of a Q&A and the version in question. Since permanently switching my desktop a few years ago, I've mostly used Pop!_OS, because I like most of their UI changes, including COSMIC, despite a handful of now mostly corrected issues... They tend to push features and kernel versions ahead of Ubuntu LTS.

That said, the underlying structure is still Ubuntu-centered. I also like Ubuntu Server, even though I don't use snaps, mostly because the installer pre-configures most of the initial changes I'd make to Debian anyway: sudo is set up, and you get the option to import your public key and preconfigure key-only ssh. Beyond that I mostly install ufw and Docker, and almost everything I run goes in Docker in practice.
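
On a fresh Debian install that amounts to roughly this much scripting (a sketch, run as root; "me" and the public key are placeholders for your own user and key):

    # Rough post-install steps I'd otherwise do by hand on a fresh Debian box.
    # Assumes the login user already exists; swap in your own values.
    import pathlib
    import subprocess

    USER = "me"  # hypothetical login user
    PUBKEY = "ssh-ed25519 AAAA... me@laptop"  # placeholder public key

    def run(*cmd):
        subprocess.run(cmd, check=True)

    run("apt-get", "install", "-y", "sudo", "ufw", "docker.io")
    run("usermod", "-aG", "sudo,docker", USER)

    # key-based ssh login for the user
    # (you'd also want PasswordAuthentication no in /etc/ssh/sshd_config)
    sshdir = pathlib.Path(f"/home/{USER}/.ssh")
    sshdir.mkdir(mode=0o700, exist_ok=True)
    (sshdir / "authorized_keys").write_text(PUBKEY + "\n")
    run("chown", "-R", f"{USER}:{USER}", str(sshdir))

    run("ufw", "allow", "ssh")
    run("ufw", "--force", "enable")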


Historically, Ubuntu was a good choice if you were releasing a licensed OS, with minimal customization, that needed CUDA more than, say, Vixie cron.


Officially you are right: they release it as a stable OS after a few weeks of betas.

Unofficially, any serious user knows to stick to LTS for any production environment. In my experience, LTS releases are by far the most common versions I encounter in the wild and in customer deployments.

In fact, I don't think I've ever seen anyone using a non-LTS version.

Canonical surely has these stats? Or someone operating an update mirror could infer them? I'd be curious what the real-world usage of the different Ubuntu versions actually is.
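
The mirror route seems doable, since apt requests name the release in the URL path. A rough sketch (assumes a common-log-format access log and the usual /ubuntu/dists/<codename>/ layout; pass the log file as an argument):

    # Count distinct client IPs per Ubuntu release hitting a mirror.
    import re
    import sys
    from collections import defaultdict

    # e.g. '1.2.3.4 - - [...] "GET /ubuntu/dists/jammy/InRelease HTTP/1.1" 200 ...'
    # also matches pocket paths like /ubuntu/dists/jammy-updates/...
    PAT = re.compile(r'^(\S+) .*"GET /ubuntu/dists/([a-z]+)[/-]')

    clients = defaultdict(set)
    with open(sys.argv[1]) as log:
        for line in log:
            m = PAT.match(line)
            if m:
                ip, codename = m.groups()
                clients[codename].add(ip)

    for codename, ips in sorted(clients.items(), key=lambda kv: -len(kv[1])):
        print(f"{codename:12} {len(ips)}")

Unique IPs undercount (NAT) and overcount (DHCP churn), so treat the output as a rough popularity ranking, not a census.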


I'm late to the thread, but it should be well known that nothing in Ubuntu enjoys substantial testing, in any release. They squander the entire release cycle, then they "freeze" the release, then they pile a huge number of changes into it in contravention of any widely held definition of the word "freeze", and then they cut the release before anyone has even tried it.

The above is exactly what happened in this case. The package was at revision 0.1 throughout the release cycle; then, 12 days before the final release, 6 weeks after the freeze, and after the beta release, they bumped the package to 0.2 with no notes other than "new upstream release". Nobody had time to try it; they shipped it out to everyone literally without trying it even once.



