Hacker News

Do you have any insights on why Lua didn't win?


I think the very reason I like it is the reason it didn't cross over to mainstream success: it is so minimal it couldn't compete with the all-encompassing Python standard library.

NodeJS is an obvious counterpoint to my thesis: its minimal stdlib is augmented with tens of thousands of NPM packages that implement every feature. This is my preference, but maybe Lua was too early for this strategy to succeed. The vicious attacks NodeJS gets over this philosophy (read about the leftpad.js debacle) show that a lot of people disagree with me, but I like a small language augmented with a rich third-party ecosystem.


Python and NodeJS both beat Lua because they were scripting languages involved with web servers and web clients, and the massive influence of the web on the job market meant that everybody was going to want to learn Python (and Javascript) and then try to apply it blindly to everything.

It doesn't matter if either language might not be competitive with Lua or any other language in terms of design / features.


I would agree with this point in respect to Python, but not Node.

I think Node was in large part successful because web developers already have to use JavaScript, so the promise of unifying on one language and one set of libraries is appealing to many.

OpenResty and Lapis are quite capable in the web space, but they haven’t seen widespread adoption.


What I drew from the leftpad event, in regards to ecosystems, was the improper use of third-party libraries: depending too much on libraries without actually validating what work they were doing and how well they were doing it, and the realization that this was common behavior throughout the community.

I don't recall anything in particular to say about small vs. big standard libraries; the above complaint would apply in either case.


One big reason is language stability. Lua changes the language in backward-incompatible ways. It’s part of how they make the language so clean, but it makes it difficult to build an ecosystem around.


I'm not OP, but I couldn't get past the fact that arrays start at 1 rather than 0.


While many languages follow a 0-indexed convention, as you see here, not all do. Another couple of cases I run into are Postgres arrays, and switching between “first”, “second”, and “nth” in Clojure.

If it helps, you can think of 0-indexing as referring to the offset, and 1-indexing as referring to the position. It's a shame to dismiss the entire language due to what one might argue is a minor aspect.
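As a minimal sketch of the position view in Lua (nothing here beyond standard Lua):

```lua
-- In Lua, the index is the position: the first element lives at 1.
local t = { "a", "b", "c" }
print(t[1])  --> a

-- In a 0-indexed language, the index is the offset from the start,
-- so this same element would be at index 0.
```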


I would not call Clojure 1-indexed because of 'first', though; it uses 0-indexing for lists, vectors, and arrays.


To be clear, I’m not calling Clojure 1-indexed: I provided “nth” and “first” as examples of contrasting 0- and 1-based conventions; in this case within the same language.


Clojure, being a Lisp, also has different enough syntax that it's easier to jump back and forth, because you're already doing way more context switching. Lua looks enough like other C-style languages that you can forget it is 1-based and make all sorts of nasty mistakes.

You can certainly argue it is unfair, but it is still something that increases the risk of mistakes.


Replacing (first coll) with (nth coll 1) in Clojure happens to me more often than I'd like to admit


I mostly write Java for work. Back in the good old high school days, I spent two years learning programming in Pascal (yes, Turbo Pascal for DOS!), so seeing arrays start at 1 is not strange to me :)

Of course, in Pascal you can actually pick any numbers where the index starts and ends.


Totally agree with you. Zero-indexed arrays are one of those language design dogmas that are simply set in stone; it's really awkward to switch between languages when Lua has such a fundamental difference of opinion.


They don't have to, since Lua tables are hash maps and you can use anything as a key. You can start them at 0 if you want: table[0] = "foo" is valid and works. The downside is you have to add '-1' in a few places, but not as many as you'd think. All of my Lua code uses zero-based arrays; I think the only place I had to add -1 was when initializing numeric for loops.
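A quick sketch of what this looks like in practice (the explicit `n` length counter is one common convention, not a language feature, since `#` won't count an entry at index 0):

```lua
-- Zero-based "array" built on a plain Lua table.
local t = {}
t[0] = "foo"
t[1] = "bar"
t[2] = "baz"
local n = 3  -- track the element count yourself

-- The numeric for loop is where the '-1' shows up:
for i = 0, n - 1 do
  print(i, t[i])
end
```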


Since Lua “arrays” are just tables with numerical keys, couldn’t you just start at 0 if that’s your preference?
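You can, with one caveat (an aside, not from the parent comment): the built-in sequence operations still assume 1-based indices, so `#` and `ipairs` both ignore a value stored at index 0.

```lua
local t = {}
t[0] = "a"
t[1] = "b"
t[2] = "c"

print(#t)        --> 2 (the entry at index 0 is not counted)
for i, v in ipairs(t) do
  print(i, v)    -- visits only t[1] and t[2]
end
```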


For me it was the fact that the array ends at the first 'nil' you store in it. 1-based arrays I could get used to; that was just maddening.
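The behavior being described, sketched for illustration: storing nil creates a "hole", and the length operator `#` is only defined for hole-free sequences.

```lua
local t = { "a", "b", "c" }
print(#t)   --> 3

t[2] = nil  -- creates a hole in the sequence
-- #t is now undefined by the language spec: it may be 1 or 3
-- depending on the table's internal representation.
```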


The lack of a large standard library and a decent package manager were key in my mind.

I really love the simple elegance of Lua, but the build everything yourself ecosystem made Python and Ruby naturally better choices for larger non embedded solutions.

These days the package management problem has been solved with LuaRocks, but many packages are unmaintained.

The language is so old that its place seems to be set and unlikely to change.


Perhaps there was no race. Lua is a great language; Python is a great language. Both have a place in my toolkit. If you are into language design, Lua is a language worth studying and understanding.


It didn't come up, but, oh, btw: you can use JS functions and packages as libraries in Lua.

https://github.com/PaulBernier/castl




