It does make quality assurance an absolute nightmare. I would know; our application is like this to the nth degree: config on top of config on top of setting on top of options.
But if you also want your product to be productive for a wide array of use cases, it's necessary. You need to think about your market.
Which is why you should think about how these options interact and compose at the start, as opposed to only adding options in an ad-hoc manner (whether you do it willy-nilly or only when your arm is really twisted).
"You mean we shouldn't use 10 layers of abstraction and 274 libraries to achieve our goal? I mean, we use a lot of resources, but look how polished the UI is: everything is flat."
Thank god RAM prices have risen. Maybe some people will start to program with their heads instead of their (AI) IDE.
My handwriting gets worse every time I'm forced to write now, too, since I do it so infrequently. I type nearly everything, and outside a signature I only write with a pen roughly two or three times a year; every time I do, it's more difficult to figure out what I actually wrote.
I wish there were an easy way to print stuff on the go, and then I'd never have to use a pen again; maybe as we get to a paperless society that'll be the case.
I understand in part what you mean, but Ubuntu is also a distro that many users rely on for its stability and reliability, not just as a hacker toy.
I still use it like that, and anyone can, but normal users are the main target.
Ubuntu is a classic distro, meaning hard to automate for deployments, with an invitation to buy services for that purpose. So it offers no advantage in stability and reliability compared to modern declarative distros like NixOS or Guix System, with their essentially read-only system, easy custom deployment and replication, and easy rebuilds: a fresh install every time, plus a poor man's version of IllumOS Boot Environments with their linked generations.
"Normal users", meaning non-tech-savvy ones, are Windows or OSX targets, because with Ubuntu they still need to use a terminal a bit more than with Microsoft/Apple stuff, and they still have to deploy their own systems. For slightly more "power" users, having to manually deploy an official ISO, then customize it or keep it going in a polluted state, one update at a time, is a NIGHTMARE. Try to upgrade a normal Ubuntu across a few releases and you'll see things breaking; you fix them manually, augmenting the entropy. With a declarative distro, any update, intra-release or cross-release, is a fresh install out of your config: no forgotten fixes/hacks, no leftovers.
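To make "a fresh install out of your config" concrete, here is a minimal sketch of a NixOS configuration fragment; the package names and options are illustrative, not a complete working system config:

```nix
# /etc/nixos/configuration.nix -- illustrative fragment, assumptions marked
{ config, pkgs, ... }:
{
  # Everything the system should contain is declared here;
  # nothing is installed imperatively one package at a time.
  environment.systemPackages = with pkgs; [ firefox git ];  # example packages

  # Services are declared, not hand-configured in /etc.
  services.openssh.enable = true;

  system.stateVersion = "24.05";  # assumption: any recent release works the same
}
```

Running `sudo nixos-rebuild switch` builds a new system generation from this file, and `sudo nixos-rebuild switch --rollback` atomically returns to the previous one, which is what makes cross-release upgrades reproducible instead of entropy-accumulating.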
Those who claim Ubuntu is stable are stuck in a far past, before declarative distros existed.
For example, in the GNOME team we upload to Debian first, then merge with the Ubuntu changes, if any, but we try as much as possible to use the same sources for both.
No way. Flatpaks are clearly represented in the software shops of their adopted distros (I use Fedora and Pop!_OS, both of which use Flatpak).
From the software shop GUI, I can choose flatpak or dnf/apt from the dropdown. From the command line, flatpak has its own commands (vs. apt's silent under-the-hood behavior).
Flatpak is better than Snap. I mostly use Flatpak for commercial software (Discord, Steam, etc.), but either way it remains my choice as a user.
The point is totally different: the purpose of both is upstream-managed distribution, treating the individual distro like a container ship to be loaded, something not to care about, a commodity.
We know the arguments: distro packagers are often late to update; some projects are very complex to package and demand gazillions of resources to build; upstream devs, on the contrary, will surely keep their own packages up to date; sandboxing is good for safety; etc. BUT we also know the outcome: 99% of such packages are full of outdated and vulnerable deps; they are themselves mostly outdated, since they are not packaged by the upstream devs, who just publish the code as usual; and they have many holes punched here and there, because a browser that lets you download files you then can't open in other apps is useless, as is a PDF reader that can't read a file because it's outside the right place. Besides that, you get a 30+ GB desktop deploy instead of a 10 GB one, bad performance, a polluted home directory, very scarce ability to automate, AND all of them still need a classic package manager, since they can't handle the base system.
So why do they exist? Because SOME upstreams do not want third parties distributing their binaries; they are commercial vendors. They NEED such a system to sell their products, ensuring those can work like a cancer in an open ecosystem that was never designed to be a ship for someone's cargo, but a unique individual desktop anyone tunes as they want.
That's why they are crap.
The next step beyond classic package management is the declarative/IaC kind, like NixOS or Guix System. Those who want Snap, Flatpak, AppImage, etc. just want Windows, with all the bloat and issues of Windows.
> Just like they wasted time and effort on Unity and shuttered it in favor of Gnome
Unity served well for years; it would have needed a rewrite anyway for the post-X11 era, so indeed some resources were wasted, but experiments are also important in technology, and many still love what was (is) the Unity user experience.
> and they wasted time and effort on Mir and shuttered it in favor of Wayland
Mir is still there and still used. It's now a Wayland compositor, but it maintains its API; the different communication protocol doesn't change its purpose.
> and they wasted time and effort on Upstart and shuttered it in favor of systemd.
When Upstart started and was adopted, systemd didn't exist and hadn't even been designed, so Upstart served many well for years. Not a waste.