
That's great when the feature set you're talking about has already been validated by the market. When it's not, it's a mistake to focus on that. I believe we call that "premature optimization."


Many will not evaluate software that is unstable, because that is a clear sign the whole effort is not yet suitable for any meaningful time investment.

Play it how you will, of course. I will pass hard personally. Lack of stability is very high on my qualification list.

To be blunt, where instability exists, there are usually other problems that were given priority.

One may just be flashy speed and other sizzle, delivered at the expense of the solution being robust.

It may only take an event or two to cancel out the potential gains from those things. Turn that crank a time or two and many will just get back to work.

Another attempt or two? Many will ignore the solution, or will have adopted something else by then.

Net result? The team that did not make stability a priority likely did a lot of market building for the ones who did.


>The team that did not make stability a priority did a lot of market building for the ones who did.

I personally am very frustrated by this constantly, but there is no way around it that will always work. Someone has to pay the cost of market building, it is a separate cost from stability, and sometimes you can't afford both.


There isn't.

But, what does work far more frequently is to get the solution robust early. The longer that gets put off, the more difficult it becomes, in time, in cost, and in what users are required to do along the way to support it all.


I agree with you, but still someone has to convince the customers to pay for that, and sometimes they won't. I've been on far too many projects where the customer won't even pay for unit tests and gets angry at the suggestion.


Well, is that a contract scenario?

If so, they can pay now, or pay later, just make sure you manage expectations so they actually do pay.

If, somehow, that cost ended up on you, of course they refuse! Why pay for a free lunch?

So don't allow that. Seriously.


You're preaching to the choir. I wouldn't work on such a project again, but when you say "manage expectations" what usually happens there is the customer realizes it doesn't match their expectations and then goes somewhere else and finds something that's in their price range. The point is, there is a class of customers that aren't going to pay for this. Ever. They either can't afford it or have decided it's not valuable.


So let them. That's what I do. And I do that because someone, somewhere will pay now, or pay later.

Maybe the customer finds someone hungry enough to eat that cost for them.

Or, maybe that customer pays hard going through this cycle a few times too.

Not my problem and it does not have to be yours.


To be clear, I was mostly referring to low-or-zero-cost products mentioned previously like Gmail, Outlook or Excel, or any random zero-cost piece of open source you can download on GitHub. If you depend on that stuff then it is your problem; that's a signal that you won't pay the extra cost for stability. The businesses that can afford the extra stability are not using the consumer offering, they pay Google/Microsoft for a long term support contract. (And yes, Xorg, Wayland, GNOME, KDE, and all the other open source Linux desktops are all under the category of random free project on GitHub)


They are making total bank on those things. And the users are paying, just not with dollars.

Can definitely be robust and should be.

In this scenario, it is a mutual problem. Nobody has to use the software, true. However, it again is all about managing expectations.

If it is oversold, people will end up with bad experiences and that does impact who delivered them.

This has already happened. Will again too.

On a short timeline, maybe no big deal. On a longer one, higher-order impacts may actually be painful.

Painful as in, say, a decade of negative public sentiment bringing on brutal regulation.

Entities putting billions in the bank can and should make reliable software.

There is a YUGE diff between Gmail, Excel and friends, and rando OSS.

This discussion ranged far away from the initial premise.

So, let us take it back a bit.

Wayland wants to be the de facto display solution. It does not have some of the priorities many users have. It also does have priorities many users share.

Replacing X11 was made an explicit intent too.

Well, if it does not actually do that?

"All that Weyland shit talk" is the outcome.

Maybe drop the "replace X11" part, or maybe find a way to actually replace X11, or maybe just ignore the shit talk?


I do not agree all those are the same as rando OSS on GitHub, BTW.

Because of that problem, I have always maintained we need a way to better fund some projects.

I would love to see the display system funded and all the good stuff worked out properly. For me, the network features and general flexibility of X is worth some money.

When I ran SGI machines, that money was paid and the display systems were flat out awesome.

If it does not do those awesome things? Meh, can just run Win 10.


I believe there are still people maintaining paid X servers on Windows, on Linux not so much.


There are, last I looked.

I used Exceed a lot. Exemplary. That program did everything nicely. I even used a CAD app on Windows that actually was built for X and it was network transparent same as any Unix box!

There is a sub $100 one out there. Maybe a bit more now. Maybe gone too. It was very good.

The freebies are respectable and great for sysadmin work.

Funding a first class X11 replacement that also does some new things people want for mobile and embedded makes a ton of sense to me, and I would sign right up too.

Maybe getting a bunch of us tossing a tenner a month at this would fund a project that just gets it all done.

And we need one.

It is obvious to me that X11, used the way it can be used, sees way more use today than anyone may have expected. When I was on IRIX, X11 was flat out awesome. Shit just worked, and was fast, and 3D graphics over the wire were smooth, fast, performant.

It took a long time for the rest to get there and when they did, nobody really grokked it, unless they came from a solid implementation.

And nobody from the PC, Windows camp got it. They did not even understand what an app server was.

No fault here either. People saw what they saw, worked how they could work. I get that.

But today, people still see what they see, work how they work, and ripping all that up is painful enough to not make sense.

If it did, we would have seen this go differently.

So maybe this should not be a volunteer effort. It matters too much, and to way more of us than I believe was expected.

If we fund something that does what X11 can do and that also will do the new, spiffy, lean things people want today, we would have a killer environment.

Having that be OSS?

Worth it to me.

Or, we can watch the carnage for another decade...

Good discussion, BTW.


The only real option for a high-end Unix workstation with heavily integrated software that still exists seems to be the Mac Pro. Linux is really not comparable to IRIX.

People keep saying that Wayland needs to be more of an X replacement, but it's never clear what that means. If you want 3D graphics over the network, that currently means Vulkan, and if you were going to do that, there doesn't seem to be much reason to tie it to any particular window system.


I do not know what it means either. I am just gonna think out loud a bit, because I do care about this stuff. But, am also not currently impacted much. Nice place to be for me at least!

In a general sense, Wayland is not Unix like, in the same way systemd isn't. X11 is, as were various init systems. And fact is, at the time those things were done, multi user computing was a thing, and it was all done with those ideas in mind too.

At the time, micros were on the rise, and apart from a few exceptions, were single user, and started out single tasking.

Two different roots of thought, now well grown into trees that share intermixed and intertwined branches.

I know that is an issue. Whether it needs to be addressed is an ongoing matter. A matter I am happy to discuss, but not currently one that impacts me too much right now.

My focus is pretty far away from a lot of that. Doing lower level stuff, and often with no OS, embedded. So, write it, include it, port it if need be, then compile and deploy it.

Minimum requirement is gcc, command line, terminal. In all other areas, Win 10 is fine. I really could give two shits what OS it is otherwise. Machines are all fast enough, big enough, etc... good times (for me) actually.

Again, not dismissive, just not a focus at the moment. That could change and I might care a lot too. This kind of thing happens to people. Sometimes often.

When I use X, I tend to make good use of the network capability. It is the primary reason to use it, along with ways, means, and tools that have worked forever. I value stuff like that very highly because I much prefer learning and using things that are new to me, not endlessly remapping stuff I already know cold.

For example, in the past, I have enjoyed weaving a single desktop environment from a pile of machines and software. Did that to support, sysadmin, train, and develop for advanced engineering software users of all types. I had the industry on my desktop and could run or do almost anything on a variety of platforms from that desktop, and the time savings on that, as well as my general potency, was off the charts good. Beat a lot of other people who did not really grok X too.

Log in and just do stuff, and depending on what gets done, a few machines, or just one gets the job done, served up to me nicely, consistently.

For a long while after IRIX was done, I used a machine as my head end. Just worked great for that kind of work. Pretty much nothing was as good since, and apps have become far less flexible too. A pile of VM's are required for that kind of thing today and it is a lot more work.

But, it is also a lot more lean hardware wise too. That is significant right now, though I do not know how significant given how cheap stuff can be.

Anyway, back to the UNIX part. Small programs, smart data, pipes, all that stuff is UNIX. X11 allows all that to travel where it needs to and it is a very nice way to work.

If one wants, everything can be distributed too:

Fonts from one box, X Server from another, application on yet another, running on shared files from yet another, window manager on yet another, etc... An Indy used to serve up its nice window manager for my Linux box for quite a while, just for shits and giggles.

And that is the UNIX way.
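The moving part behind all of that is basically the DISPLAY setting plus X's network protocol. A minimal sketch of the idea, in Python purely for illustration; the hostname "headend" is hypothetical, and the remote X server has to allow the connection (xhost/xauth, or an ssh -X tunnel instead):

    import os
    import subprocess

    # Point DISPLAY at another machine's X server. The program keeps
    # running on this box; its windows, input, and rendering all happen
    # on "headend" instead.
    env = dict(os.environ)
    env["DISPLAY"] = "headend:0"

    # Any X client works the same way; xterm is just a stand-in.
    subprocess.Popen(["xterm"], env=env)

The font server, remote window manager, and shared-file pieces are the same trick, each pointed at whichever host you like.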

Sometimes it is powerful. I see people struggle with managed data, engineering software, etc. for example.

Under X, I just made a beefy box, one copy of the app, one database on local disk, one backup system, and on IRIX that could be done live while people were working, with just a few set-and-forget scripts too.

Then, serve 20 to 30 users. They do not have to know anything other than how to use the app. And they can't really break it either.

Admin on that thing was mostly cake, and the users could run whatever they wanted; as long as it had a respectable X server, they were good to go. Few of them would even notice or care, and it worked great over 100BASE-T networks too. Kind of crazy to think about today.

The nice thing was those users could not touch the data except through the app. This saved so. Damn. Much. Pain. Worth the whole setup right there.
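A rough sketch of what the user side of that kind of setup could look like today, assuming OpenSSH with X11 forwarding enabled on both ends; the host "beefybox" and the application path are made-up stand-ins:

    import subprocess

    # Each user runs something like this from their own workstation.
    # The application and its database live only on the server; X11
    # forwarding (-X) sends the app's windows back to the local
    # display, so users only ever touch the data through the app.
    subprocess.run(["ssh", "-X", "beefybox", "/opt/engapp/bin/engapp"])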

So that is one case. That case made me a ton of money too. Edit: That case still happens online, but also suffers from being online. Outages, updates, failures on the part of the provider, data being leveraged in distasteful ways, sometimes expensive ways.

Done local, with X? Total control. It is nice.

Application servers. Loved them. And with super expensive 5 figure software? Making that available and performant to the people who need it without all the installation hassles is golden. Really do miss that.

All very UNIXEY.

Other cases include multi-head machines. Beefy machine shared by two people. Was pretty easy with X. Not sure what the state of things is today. Probably not good. Probably not needed either. We have a crazy amount of capable, cheap hardware these days.

Basically, X is multi user graphical computing and all that can mean.

Current display systems are not.

We have moved quite far away from multi user systems, frankly. The basics exist and work, but we have no display systems that work in the same way. And very large numbers of people do not even know how they could work, either.

Honestly, that is probably the core issue in play, boiled down!

"No reason to tie it to a window system"

Well, given a multi user graphical display system, "particular" kind of becomes moot. Run the window manager you want, etc...

In any case, here is what I think will happen:

X11 has always had this basic problem. People who did not use it, or who cut their computing teeth on single user dominant computing think very differently from those who did.

And it is not just about displays. The same thing is true for the command line too. Watching people come from DOS/Windows onto a proper multi user environment showed these differences in thought to be just as stark then as they remain today in the display context.

Most people know what multi tasking is and expect it. That is the dominant mode today.

Far fewer know what multi user computing is, and again I am speaking from the perspective of someone who took full advantage of multi user computing, and in the graphical as well as the general sense.

And that is the clash, in my view. A secondary one is breakage of a lot of little time tested, production proven ways and means of doing things.

Focus follows mouse + middle button paste is amazing when working across various windows that may overlap, or exist across multiple virtual desktops, for example. I can tell you, that is a super hard one to get away from. I did, and have, but it hurt.

There is a ton of this "noise" and, given the state of the discussion, it appears to add right the fuck up. Still.

But I am rambling:

What will happen is we will either prove out the idea of multi user graphical computing is not useful, or we won't.

If not useful, all this stuff will eventually go away and we will bring up the next set of users on whatever that paradigm looks like in the end and that is that. The rest will be history.

Sort of like how we see mobile dominant users and desktop dominant users today. And that is far from settled. Mobile is nowhere close to displacing desktop. I use both hard too. No way.

But, if what gets made really is not so superior? Like those "precious use cases" really do hold significant value? Or, even if they could, but die off for a while?

Then we will do all this over again, reinventing X11 just like people tend to reinvent UNIX.

And look at the current trend to serve it all up via browser! Seems to me a lot of great 70's era ideas had real legs.

And this will happen because tech never, ever really dies. Someone, somewhere will see it and build on it to completion, and it is game on. Or, they will think that way again because the line of thought has merit.

I plan on watching it all with great interest.

Personally, I believe in multi user graphical computing and have seen, applied, and profited nicely from the value it has. Everything should go over the wire, just like everything is a file. So I have something I want to run on my phone? Just ask for it on the display I am using and go. That's X11. I think that kind of thinking is compelling and I think that because I got to do that and it was sweet compared to the usual, find it, install it, configure it, move / copy data to it mess.

Cloud computing does that with a browser nicely. But, it also means a few people get to rule the world too.

Whatever one may think of that, if we were to build multi user graphical computing in from ground zero, I believe it will endure for a long time, just like UNIX itself has. And it will because the core ideas are strong, very generally useful, etc.

We have seen this kind of thing before too. Really great ideas tend to stick around.

And this is not about the people doing Wayland not having great ideas at all. They are trying hard to bring some new ideas into play, much like the X11 people did.

Cool!

It is more about trying to convince everyone that the great ideas behind X11; namely, multi user graphical computing, aren't great, or can be replaced with something that fundamentally is not multi user graphical computing.

Given all that, Wayland NOT replacing X11 may well have set different, and maybe less conflict-laden, expectations.

But that would also have raised the question: what value does multi user graphical computing have? We may be in a better place had it been asked and answered.

I think it has a lot. And I think, like UNIX, it is most important that it can all be done, not so much that everyone needs to, or will benefit from doing it, or being forced to do it.

We shall see!


There's someone else who thinks like me, and well put. Worth every minute of attention.

I remember my early encounters driving me absolutely mad because I did cut my teeth on that one-seat, one-user environment. I think where I've ended up going in a different way than most is that over the years I keep cracking away at it because I want to understand it. It's an intuition thing, and you never really appreciate the miracle of it until you sit down and come to terms with what X really did. It was the glue between app logic, state, and a constantly floating set of goalposts represented by every possible set of different hardware.

I haven't even torn into the source. It's taken me so long just to grok the other layers of the problem space. Personally, I think the space X11 occupied is one of the most enticing areas of research open to me, even if I never seem to make the time for it.

Abstraction-wise, we assume we carry context with us everywhere, but an X environment driven over a network breaks you of that quite handily, and I don't think the "Cloud"/SOA solves the real-life dilemma just by lifting everything into a browser.


> Good discussion, BTW.

Agreed.


And there is an argument for not being able to afford the effort in that scenario too.

At the least, risk will be higher than it needs to be, and that puts the other validation into question as well.


Secondly, say growth in user / subscriber base is a high priority.

Often is, right?

When does robustness end up a priority?

Could be a very long time, and/or never, particularly given that the core value proposition and/or the features intended to deliver it are not themselves validated by "the market", as you say.

One solution to this is to pay people to adopt the solution.

Another is to deliver features more specific to them.

Yet another is to cover it up, make recovery robust, transparent, etc...

These are all perfectly fine ways to approach beta-type cases, an initial user base, seeding for growth.

After that process has shown what the market finds compelling, stability should be right there in the list with the core set of software to build on.


In some domains, stability and correctness are features.

You can't omit stability in life-critical software, for example.

On the other hand, software's usefulness can outweigh its instability, if there are no alternatives.


Serious question: when are there not alternatives?

The point being made here is not that it always must be stable.

It is all about how often stability is not a priority when it could and/or should be.

And in the no-alternative scenario, it can be compelling to continue to ignore robustness to get growth and lock-in.

When that happens, stability, robustness often stay in the back seat for a large fraction of the product life cycle.

For any given user, the impact may be just low enough to keep them on board too. New features, and asking more money for them, or offering them to make up for that impact, can be an ugly cycle, leaving users ripe for the picking later on.


> when are there not alternatives?

Example: Steam cloud save. Countless times I have had to force-quit Steam's background service because Cyberpunk 2077 and GTA V freeze while saving.

Steam doesn't show the launch-game button, even after the game has been killed from Task Manager.

There is no heartbeat-like mechanism to keep track of the game process's state.

And there is no alternative way to launch a game that I bought from Steam, so the instability becomes a bit of a nuisance.

But that's a minor issue, where the unstable features are unimportant ones.

> It is all about how often stability is not a priority when it could and or should be.

And I'm on your team too. That's why my argument is that stability IS a feature too, countering the grandparent's argument that stability is ignorable as long as the features are validated by the market.

> And in the no alternative scenario, it can be compelling to continue to ignore robustness to get growth and lock in.

As a paranoid user I agree this has been a nightmare. I constantly have to make sure I won't get locked into services that I pay for, and oftentimes I find myself using open source solutions rather than paid ones for the same reason.


I agree, and was very unclear and incomplete with my question. One alternative is to simply not use the tool. Can be rough though. Your pain is acute!

It absolutely is a feature.

Guess the externalities problem crops up in this context.

How to actualize those costs better so the equation makes more sense?

And yes! I hate lock-in viscerally. Frankly, I will avoid it at very considerable cost.


> How to actualize those costs better so the equation makes more sense?

Which equation exactly? The balance between the effort toward stability and other stuff? Sorry, I'm a bit lost.


Hey, if you have the kind of money that medical companies spend on testing and stability, and you want to spend that on Linux desktop environments, I won't complain. Although it probably is better to spend that on developing life-saving devices than it is on fixing outdated desktop paradigms.


Sidebar: desktop paradigms are hardly outdated.

There are competing paradigms now, and that is good, but they have not yet actually reached use value and productivity levels where it is appropriate to say the desktop paradigm is out of date.

At the worst, both will become more generally flexible to keep their value to the users.



