Hacker News | 10x-dev's comments

Crafting Interpreters by Nystrom.

It's great because of how accessible it makes compilers to average programmers, and, as a bonus, it also throws shade at the overly dense compiler textbooks:

"Classic compiler books read like fawning hagiographies of these heroes and their tools. The cover of Compilers: Principles, Techniques, and Tools literally has a dragon labeled “complexity of compiler design” being slain by a knight bearing a sword and shield branded “LALR parser generator” and “syntax directed translation”. They laid it on thick." (chapter 6)

Having read Compilers for a compilers course in college, Crafting Interpreters was a fun read on many levels.


I don't think that is in the spirit of this list at all, no matter how good it may be. This is about seminal papers advancing the state of the art in the theory of programming languages.


It's certainly possible to build a compiler in C for a language that provides zero cost abstractions.


The same way it is possible in Brainfuck or PowerPoint: that's just Turing-completeness plus outputting a binary file.


Sure, but your statement about C remains false.


Being pedantic, it is either true and completely useless, xor false. Feel free to believe either option; it won't make a sane vector implementation for generic datatypes possible in the language.


Your statement that C cannot be used to implement zero cost abstractions is factually false. There are no other options here. You said something that is incorrect.

Edit: downvoting will not change that


Are joins in a 5NF database now as fast as querying a denormalized database?


Maybe the joins are faster? It's really hard to tell without more context/detail.

I think many were burned by MySQL back in the day, trying to use SQL as a document database, or using PHP frameworks that happily did a hundred queries per page view.

As a general rule of thumb, for a REST app - I'd say the db should be normalized, and the cache layer(s) can handle the denormalization.

I.e. when you GET /page=1, Varnish can spit out a response from RAM (which, if you squint, is a denormalized projection of your data), or it can go talk to your app, which talks to the db. And the latter is most likely Fast Enough(tm).
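A toy sketch of that layering, with a dict standing in for Varnish and hypothetical table/function names, just to show where the "denormalized projection" lives:

```python
# Toy sketch: the cache layer handles denormalization so the db can stay normalized.
db = {"customers": {1: "Ada"}, "orders": {10: (1, 9.99)}}
cache = {}  # stands in for Varnish / any response cache

def render_page(order_id):
    """Expensive path: join the normalized tables to build the response."""
    customer_id, total = db["orders"][order_id]
    return {"order": order_id, "customer": db["customers"][customer_id], "total": total}

def get(order_id):
    if order_id not in cache:          # miss: go talk to the app, which talks to the db
        cache[order_id] = render_page(order_id)
    return cache[order_id]             # hit: a pre-built, denormalized projection from RAM

assert get(10) == {"order": 10, "customer": "Ada", "total": 9.99}
```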


Maybe I'm missing some context, but isn't that true by definition even if the db does nothing special? You either spend time sending N queries and waiting for responses, or join and use one query. Given actually matching scenarios for both, the one with less communication overhead wins.


In a normalized database that's true, but in a denormalized database, by definition, you get a third option, which is to have tables with redundant data that can be returned in a single query (as if it were pre-joined, I suppose).
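A minimal SQLite sketch of that third option (table and column names invented for illustration): the same question is answered once via a join and once via a redundant, "pre-joined" table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
-- Normalized: orders reference customers by id.
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL);
INSERT INTO customers VALUES (1, 'Ada');
INSERT INTO orders VALUES (10, 1, 9.99);

-- Denormalized: the customer name is duplicated into every order row.
CREATE TABLE orders_denorm (id INTEGER PRIMARY KEY, customer_name TEXT, total REAL);
INSERT INTO orders_denorm VALUES (10, 'Ada', 9.99);
""")

# The normalized schema needs a join to answer "orders with customer names"...
joined = cur.execute("""
    SELECT o.id, c.name, o.total
    FROM orders o JOIN customers c ON c.id = o.customer_id
""").fetchone()

# ...while the denormalized table answers it with a single-table read.
prejoined = cur.execute(
    "SELECT id, customer_name, total FROM orders_denorm"
).fetchone()

assert joined == prejoined == (10, 'Ada', 9.99)
```

The price, of course, is redundant storage and the risk of the copies drifting out of sync on updates.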


Depends. Denormalized means the database contains redundant data. If a query has to scan 10x or 100x as many rows due to redundant data, it is obviously going to be slower. But it is hard to say anything general, since denormalization will make some queries faster and others slower.


With a good index you will not scan more rows.

But each query will use a different copy of the same data instead of joining against a single copy.

Storing both copies in memory takes more space, so you can't cache as much in memory.

I'm not talking about Redis or memcached, but the page cache inside the SQL engine.


They have always been faster! With 5NF, the database is smaller and all the rows you join will be in memory in the SQL server's page cache.

With a denormalized database, your reads will have to go to disk.


Seq scans will be faster in a normal-form database, and if you're seq scanning and then joining other tables on an index, that might be faster too. Otherwise the denormalized table will probably be faster.


I work at Meta and I use tools written in bash, C, Java, Go, TypeScript, Kotlin and Dart daily. I've definitely noticed that some C++ tools are moving to Rust. I would have preferred to see D in its place, but it is what it is.

Personally I'm not a fan of Rust, but it is interesting to see how the language is growing as it hits mainstream. Lots of similarities with Go back when it was cool and hip.


Not that you need to be a fan, but what are your reasons for not liking Rust? As a Rust developer, I'm always curious.


I think its killer feature is the memory safety guarantees, but other than that, I just don't find it as readable as other modern languages, mainly because it's symbol heavy.


Fair, but like any language this becomes a non-issue given enough time with it.


I've never liked the "you'll get used to it" approach to life.

I prefer to make my life harder by pursuing the perfect solution. It has bitten me in the behind multiple times in the past :)


Is your use of Dart related to Flutter?


Yep, our team has a few quick-and-dirty GUI tools in Flutter.


I loved 50% of my Master's degree program. Those were the best CS courses I ever took (writing assemblers, compilers, CPUs in FPGA, OS kernels from scratch). I could have presumably avoided the degree and done it all by myself, without the knowledge of a professor, without the support of the TA and without feedback from others, but it would have taken at least twice as long and I am not entirely sure I would have been able to get to the finish line.


Agreed completely. Well-taught graduate systems courses are fantastic. Additional courses might include computer architecture (though maybe that was your FPGA course), networking, graphics, databases, numerical computation, parallel programming, AI/ML, etc.

The primary advantages, as you note, of taking a formal course are that it is well-structured and you get feedback and support.

Secondary advantages include a potentially positive and motivating learning environment, meeting and interacting with instructors and other students, increasing your portfolio of completed projects, and a potentially useful or beneficial degree or certification.

Overall though I'd say the main advantages come from completing the course projects.

If you have the time and motivation, you can teach yourself from the same material, but that usually requires more time and motivation.

It's a shame that formal and self-directed education are often seen as being in opposition to each other. As a field I think we should support and encourage both.


This is my biggest issue with working at FAANG (been at 2). Lots of people just don't give a shit. To paraphrase the Silicon Valley show: "you got your RSUs now fuck off for 4 years". I can't fault people for making the best financial decision for them, but for crying out loud, give a shit about the code. Write the unit test. Write the docs to explain the architecture. Refactor the code while you're editing that file. Think about class and method names. Give a shit.


I've been working with startups for a while and I never got the chance to give a shit. Not even when I was CTO. So many external pressures, deadlines, hacky releases to demo to whatever investors. Responding to A/B testing. Firefighting. And the list goes on.

The only time in my life I had the possibility to give a shit was when I was working for shit money contracting for the government. I've never been in that situation again, where I could spend as much time as I needed until I delivered to the quality I desired. Architecture diagrams, properly planned executions, testing, etc. Much slower moving than startups, but I trust the systems I wrote to continue saving lives as they have done until now. Most of the code I delivered for startups I don't even trust at release, so what can I say about decades down the line...


For what it's worth, having worked at Amazon for a bunch of years now, this is the highest percentage of people who truly Give A Shit I've ever encountered.

I know it's not universal but in the parts I've worked in, it's intoxicating.


Being one of the few people who give a shit in the middle of a corporate culture that, as a whole, does not, is a very good recipe for fast burnout.


My 'favorite' silly thing PMs do is UX research studies (typically on 5-10 people) and essentially ask completely untrained people if we should go with X/Y or Z. It's a super-effective way of avoiding responsibility for product decisions ("the data suggest we should go with Y"). If only building good products were as easy as asking what customers think they want.


Either they're doing the UX research wrong or (more likely) you're misunderstanding the process. You don't ask them if you should do X/Y/Z. You ask them to do X in the program, and see that none of them can find widget Y which controls it because they keep clicking on widget Z.

It's about observing the users fumble through your UX when you know their motivation.


> It's about observing the users fumble through your UX when you know their motivation.

Some time ago we did such a test. We called 10 customers to our offices and had them do some flows in the application. They didn't fumble. They pretty much did what they had to do and left positive reviews.

That whole thing got scrapped because consultants convinced our CEO that qualitative data is not good for global scoped startups, and that we should be building based on quantitative data.

Honestly, in less than a year, our customer experience was already taking a dive because of all the extra little features and strange UI elements we kept adding; it became a confusing mess, and our tracked NPS (Net Promoter Score) showed that. I've since left the company, but I check on them from time to time, and they never really recovered; they continue doing A/B testing in the hopes of hitting that sweet spot. It's just an unrecognizable monster at this point, in my opinion.


Data analysis is the lowest common denominator of business thinking: the simplest, easiest thing that feels meaningful and objective. Anybody can sum up two lists of numbers in Excel and see which one is bigger.


I wish the problem were my misunderstanding the process, because then I could fix it easily by learning more about the process. I do get where you're coming from though.


Only listen to customers' problems, never their solutions.


I bought one of these laptops. It came with Windows 11. I followed the steps on Ubuntu website to flash a USB drive and the laptop didn't boot from it, despite selecting to boot from USB. After disabling Secure Boot from BIOS, I was able to install Ubuntu. Windows continued to work just fine.

IIRC, whatever cert GRUB was using has been blacklisted by Lenovo because of some recent security issue in GRUB (I can't find the details right now).

Whatever your stance on Secure Boot, this increases friction and raises the tech bar for people to install other OSes. I imagine that even a "power" user wanting to try Linux would be very confused and would probably give up after not being able to boot from USB.


Was there no user friendly error message?


Nope, an error quickly flashed on the screen (something like "Error 22", no real info) before the laptop proceeded to boot Windows.


It is objectively not ludicrous. It might go against the commonly taught idea that businesses should focus solely on generating profits, but it is not unreasonable to create a system where businesses have to keep the common good in mind.

There is a difference between 'how things are now' and 'how things could be'. Imagining and wanting a different status quo is not by itself ludicrous (especially since we all stand to benefit from such businesses), it's a first step towards change.


It's not just a "commonly taught idea that businesses should focus solely on generating profit", it's the fundamental principle upon which economies are built today almost anywhere in the world. Sure, there are other ways of organizing economic systems, but to suggest that we are simply or easily going to switch to one is unrealistic. Imagining and wanting a different status quo will not lead to a different status quo, especially if all we are doing is making demands on others to change their behavior and use their property in ways that we want. In other words, if we want a different status quo, we won't get it by bitching about GitHub but by building a competitor company that does things the way we want it.


> It's not just a "commonly taught idea that businesses should focus solely on generating profit", it's the fundamental principle upon which economies are built today almost anywhere in the world.

It's hardly the fundamental principle. The fundamental principle is that people need things to survive, and it's more efficient if people specialize and trade than if everyone creates everything they need.

The pervasive idea that businesses should focus solely on generating profit is also directly responsible for lots of problems almost anywhere in the world, from driving out less vicious competitors, to rent seeking, to externalizing costs onto everyone else, e.g. via pollution.


I think you're actually both right, in different ways.

Fairly self-evidently, the sane fundamental principle for a business is "make a good/provide a service, and if you do so well, you make a good profit".

Unfortunately, for the past few decades, businesses in the Western world (and particularly the US) have increasingly been operating based on a fundamental principle of "make as much money as you possibly can, and if you have to make a good/provide a service to do so, that's a necessary evil".


I did not suggest it would be easy and also, someone has to imagine a competitor company before it can exist.


D is by far my favorite language. I put in a lot of effort to learn the language and thoroughly enjoyed discovering D's elegance. It has super clean solutions to all sorts of language issues (e.g. obj.foo() is just syntactic sugar for foo(obj), which gets you both type extensions and OO-looking methods on structs, which I miss in C).

That said I really tried to use D for my projects, but I had to give up for a rather surprising reason: the C interop is so good, that 1) most libraries provide a 1-1 translation of their C APIs, which ends up being ugly, non-idiomatic D that forces me to think in both C and D when coding; and 2) debuggers are not aware of D types and idioms, so when debugging, I have to again think in both C and D. Both of those add up to about 90% of the coding time, which is to say that, 90% of the time, when using D, I felt I had to code in 2 languages at the same time.

I'll skip some of the other issues I ran into, because I think a lot of the problems with D would go away if it had a large active community that would put the work in to maintain the D ecosystem, but that's a bit of a chicken and egg problem.

In the end, I decided that for me the reduced language overhead, solid ecosystem and modern conveniences of gnu17 C were more valuable in practice than the sweet features that D had to offer, and that made me a little sad, but I'm hopeful that one day D will make a strong comeback.

I know Walter gets notifications when D is mentioned on HN, and I imagine that if he read through this he'd shake his fist at me for saying interop-so-good-its-bad, but, if I could make a parallel with Java, I'd say that in code that uses many 3rd party libraries, D feels a bit like coding with JNI all the time (sorry). Ironically, in my opinion, D would benefit from having a community that rewrote popular libraries, instead of primarily relying on C interop.


The neato thing about obj.foo() goes even further. Ever seen code like:

    a(b(c(d(e),3)))
? Not very readable. UFCS (Universal Function Call Syntax) enables it to be written as:

    e.d.c(3).b.a();
Reading it flows naturally left-to-right.


Thanks Walter, that's a great example and I like that, because parens are optional, it could also be written as

    e.d.c(3).b.a
which is even cleaner.

For those who are thinking UFCS is a trivial detail, consider that the shell and some other languages have pipe operators (|>) to make the code flow intuitively the same way as the data.
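A rough analogue of that flow in Python (a hypothetical `pipe` helper, purely to show the left-to-right reading; Python has neither UFCS nor a |> operator built in):

```python
from functools import reduce

def pipe(value, *funcs):
    """Thread value through funcs left to right: pipe(x, f, g) == g(f(x))."""
    return reduce(lambda acc, f: f(acc), funcs, value)

inc = lambda x: x + 1
double = lambda x: x * 2

# Nested calls read inside-out; the pipeline reads in data-flow order.
assert double(inc(3)) == pipe(3, inc, double) == 8
```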

In my opinion, in a C-like language, managing to squeeze so much functionality out of the '.' operator without any downsides is the mark of a well thought out, elegant language.

Thank you for creating D.


It is cleaner, but unfortunately it is not explicit. Is it a function or a variable? I used to love those things until I noticed their defects.

For example, in C++, a = b can invoke anything. Not sure it is a good idea (except for generic code, there it is useful).

Zig has a philosophy of nothing hidden that I think is mostly good.

That said, I find D a very nice language, the only problems are:

1. small ecosystem

2. last time I tried, packaging for download and use was... improvable


Isn't it amazing how the . operator can cleanly replace :: and -> too?


Absolutely! Speaking of operators, out of curiosity, what's the reason for using '!' with templates?

Naively, I would think that making template instantiation look the same as a function call would be a desirable feature, with ambiguous calls needing to be resolved by the user.


Not many characters are left in ASCII, so reuse of an existing operator was almost required. Binary operators cannot be used, as that would be grammatically ambiguous and would need resolution in the semantic pass, which is a big no-no (that's why C++ is so slow at compiling: it cannot be parsed without semantic analysis). This left only the two exclusively unary operators, ! and ~. As ~ was repurposed for string concatenation, only ! remained.

    templ!thing(a,b)   
I would have thought that templ(thing)(a,b) would have been a good solution, as it is what is used in the declaration/definition side of templates, but this would have made removing redundant () not possible in UFCS expressions.
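The grammatical ambiguity of reusing binary operators can be seen concretely in Python, which has no templates and therefore resolves the `<`/`>` spelling as chained comparisons:

```python
# In a C-like grammar, templ<thing>(args) could be a template instantiation
# or a pair of comparisons; the parser can't tell without semantic info.
# Python parses the same token sequence as a chained comparison:
templ, thing = 5, 3
assert (templ<thing>(2)) is False   # means: templ < thing and thing > (2); 5 < 3 fails
assert (1<3>(2)) is True            # means: 1 < 3 and 3 > (2)
```

This is exactly the kind of expression a C++ parser can only disambiguate after semantic analysis.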


I would have preferred:

    templ<thing>(a, b)

because with that:

    templ!thing(a, b)
is it a function, or are you calling thing's function?

you can do:

    templ!(thing)(a, b)
but did you mean:

    templ!(thing(a, b))()

I personally always use !(), no matter what, even though it's annoying to type. I don't want to waste time constantly trying to figure out what is what; it's mentally draining.


Debugging has significantly improved; it works great for D types: https://github.com/Pure-D/dlang-debug

Kotlin became very useful by focusing on being able to consume Java code.

It allowed them to have a huge presence on Android; that's an enabler.

The same approach profits Zig as well.

Not everything needs to be ranked #1 in the TIOBE index.

There is value in D being the way it is: it's organic, and no company gets to control its fate.

> the C interop is so good, that 1) most libraries provide a 1-1 translation of their C APIs, which ends up being ugly, non-idiomatic D that forces me to think in both C and D when coding;

What do you mean? It's the same: functions and data.

    struct Data {}

    do_this(&myData);

This is valid D; it's also valid C.

The problem I think you have is that you are abusing OOP and think it's the only way of doing things, which is wrong, and this explains the sad state of software nowadays ;)

But it's weird when you then say you decided to stick with C, you contradict with yourself.


> ... you are abusing OOP and think it's the only way of doing things

> ... when you then say you decided to stick with C, you contradict with yourself

Figments of your imagination.


Thanks for your comment. Please rest assured I am not abusing OOP, and am absolutely not contradicting myself by choosing C.


Those pretty-printers are useful, thanks!

