It's great because of how accessible it makes compilers to average programmers, and, as a bonus, it also throws shade at the overly dense compiler textbooks:
"Classic compiler books read like fawning hagiographies of these heroes and their tools. The cover of Compilers: Principles, Techniques, and Tools literally has a dragon labeled “complexity of compiler design” being slain by a knight bearing a sword and shield branded “LALR parser generator” and “syntax directed translation”. They laid it on thick." (chapter 6)
Having read Compilers for a compilers course in college, Crafting Interpreters was a fun read on many levels.
I don't think that is in the spirit of this list at all, no matter how good it may be. This is about seminal papers advancing the state of the art in the theory of programming languages.
Being pedantic, it is either true and completely useless, xor false. Feel free to believe either option; it won't make a sane vector implementation for generic datatypes possible in the language.
Your statement that C cannot be used to implement zero cost abstractions is factually false. There are no other options here. You said something that is incorrect.
Maybe the joins are faster? It's really hard to tell without more context/detail.
I think many were burned by MySQL back in the day - trying to use SQL as a document database - or using PHP frameworks that happily did a hundred queries per page view.
As a general rule of thumb, for a REST app - I'd say the db should be normalized, and the cache layer(s) can handle the denormalization.
I.e. when you GET /page=1, Varnish can spit out a response from RAM (which, if you squint, is a denormalized projection of your data), or it can go talk to your app, which talks to the db. And the latter is most likely fast enough (tm).
Maybe I'm missing some context, but isn't that true by definition even if the db does nothing special? You either spend time sending N queries and waiting for responses, or join and use one query. Given actually matching scenarios for both, the one with less communication overhead wins.
In a normalized database that's true, but in a denormalized database, by definition, you get a third option, which is to have tables with redundant data that can be returned in a single query (as if it were pre-joined, I suppose).
Depends. Denormalized means the database contains redundant data. If a query has to scan 10x or 100x as many rows due to redundant data, it is obviously going to be slower. But it is hard to say anything general, since denormalization will make some queries faster and other queries slower.
Seq scans will be faster in a normalized database; if you're seq scanning and then joining other tables on an index, the normalized form might still be faster. Otherwise the denormalized table will probably be faster.
I work at Meta and I use tools written in bash, C, Java, Go, TypeScript, Kotlin and Dart daily. I've definitely noticed that some C++ tools are moving to Rust. I would have preferred to see D in its place, but it is what it is.
Personally I'm not a fan of Rust, but it is interesting to see how the language is growing as it hits mainstream. Lots of similarities with Go back when it was cool and hip.
I think its killer feature is the memory safety guarantees, but other than that, I just don't find it as readable as other modern languages, mainly because it's symbol heavy.
I loved 50% of my Master's degree program. Those were the best CS courses I ever took (writing assemblers, compilers, CPUs in FPGA, OS kernels from scratch). I could have presumably avoided the degree and done it all by myself, without the knowledge of a professor, without the support of the TA and without feedback from others, but it would have taken at least twice as long and I am not entirely sure I would have been able to get to the finish line.
Agreed completely. Well-taught graduate systems courses are fantastic. Additional courses might include computer architecture (though maybe that was your FPGA course), networking, graphics, databases, numerical computation, parallel programming, AI/ML, etc.
The primary advantages, as you note, of taking a formal course are that it is well-structured and you get feedback and support.
Secondary advantages include a potentially positive and motivating learning environment, meeting and interacting with instructors and other students, increasing your portfolio of completed projects, and a potentially useful or beneficial degree or certification.
Overall though I'd say the main advantages come from completing the course projects.
If you have the time and motivation, you can teach yourself from the same material, but that usually requires more time and motivation.
It's a shame that formal and self-directed education are often seen as being in opposition to each other. As a field I think we should support and encourage both.
This is my biggest issue with working at FAANG (been at 2). Lots of people just don't give a shit. To paraphrase the Silicon Valley show: "you got your RSUs now fuck off for 4 years". I can't fault people for making the best financial decision for them, but for crying out loud, give a shit about the code. Write the unit test. Write the docs to explain the architecture. Refactor the code while you're editing that file. Think about class and method names. Give a shit.
I've been working with startups for a while and I never got the chance to give a shit. Not even when I was CTO. So many external pressures, deadlines, hacky releases to demo to whatever investors. Responding to A/B testing. Firefighting. And the list goes on.
The only time in my life I had the possibility to give a shit was when I was working for shit money contracting for the government. I've never again been in a situation where I could spend as much time as I needed until I delivered to the quality I desired: architecture diagrams, properly planned executions, testing, etc. Much slower moving than startups, but I trust the systems I wrote to continue saving lives as they have done until now. Most of the code I delivered for startups I don't even trust at release; what can I say about decades down the line...
For what it's worth, having worked at Amazon for a bunch of years now, this is the highest percentage of people who truly Give A Shit I've ever encountered.
I know it's not universal but in the parts I've worked in, it's intoxicating.
My 'favorite' silly thing PMs do is UX research studies (typically on 5-10 people) and essentially ask completely untrained people if we should go with X/Y or Z. It's a super-effective way of avoiding responsibility for product decisions ("the data suggest we should go with Y"). If only building good products were as easy as asking what customers think they want.
Either they're doing the UX research wrong or (more likely) you're misunderstanding the process. You don't ask them if you should do X/Y/Z. You ask them to do X in the program, and see that none of them can find widget Y which controls it because they keep clicking on widget Z.
It's about observing the users fumble through your UX when you know their motivation.
> It's about observing the users fumble through your UX when you know their motivation.
Some time ago we did such a test. We called 10 customers to our offices and had them do some flows in the application. They didn't fumble. They pretty much did what they had to do and left positive reviews.
That whole thing got scrapped because consultants convinced our CEO that qualitative data is not good for global scoped startups, and that we should be building based on quantitative data.
Honestly, in less than a year, our customer experience was already taking a dive: with all the extra little features and strange UI elements we kept adding, it became a confusing mess, and our tracked NPS (Net Promoter Score) showed that. I've since left the company, but I check on them from time to time; they never really recovered and continue doing A/B tests in the hopes of hitting that sweet spot. It's just an unrecognizable monster at this point, in my opinion.
Data analysis is the lowest common denominator of business thinking: the simplest, easiest thing that feels meaningful and objective. Anybody can sum up two lists of numbers in Excel and see which one is bigger.
I wish the problem were my misunderstanding the process, because then I could fix it easily by learning more about the process. I do get where you're coming from though.
I bought one of these laptops. It came with Windows 11. I followed the steps on Ubuntu website to flash a USB drive and the laptop didn't boot from it, despite selecting to boot from USB. After disabling Secure Boot from BIOS, I was able to install Ubuntu. Windows continued to work just fine.
IIRC whatever cert GRUB was using has been blacklisted by Lenovo because of some recent security issue in GRUB (I can't find the details right now).
Whatever your stance on Secure Boot, this increases friction and raises the tech bar for people to install other OSes. I imagine that even a "power" user wanting to try Linux would be very confused and would probably give up after not being able to boot from USB.
It is objectively not ludicrous. It might go against the commonly taught idea that businesses should focus solely on generating profits, but it is not unreasonable to create a system where businesses have to keep the common good in mind.
There is a difference between 'how things are now' and 'how things could be'. Imagining and wanting a different status quo is not by itself ludicrous (especially since we all stand to benefit from such businesses), it's a first step towards change.
It's not just a "commonly taught idea that businesses should focus solely on generating profit", it's the fundamental principle upon which economies are built today almost anywhere in the world. Sure, there are other ways of organizing economic systems, but to suggest that we are simply or easily going to switch to one is unrealistic. Imagining and wanting a different status quo will not lead to a different status quo, especially if all we are doing is making demands on others to change their behavior and use their property in ways that we want. In other words, if we want a different status quo, we won't get it by bitching about GitHub but by building a competitor company that does things the way we want it.
> It's not just a "commonly taught idea that businesses should focus solely on generating profit", it's the fundamental principle upon which economies are built today almost anywhere in the world.
It's hardly the fundamental principle. The fundamental principle is that people need things to survive, and it's more efficient if people specialize and trade than if everyone creates everything they need.
The pervasive idea that businesses should focus solely on generating profit is also directly responsible for lots of problems almost anywhere in the world, from driving out less vicious competitors, to rent seeking, to externalizing costs onto everyone else, e.g. via pollution.
I think you're actually both right, in different ways.
Fairly self-evidently, the sane fundamental principle for a business is "make a good/provide a service, and if you do so well, you make a good profit".
Unfortunately, for the past few decades, businesses in the Western world (and particularly the US) have increasingly been operating based on a fundamental principle of "make as much money as you possibly can, and if you have to make a good/provide a service to do so, that's a necessary evil".
D is by far my favorite language. I put in a lot of effort to learn the language and thoroughly enjoyed discovering D's elegance. It has super clean solutions to all sorts of language issues (e.g. obj.foo() is just syntactic sugar for foo(obj), which gets you both type extensions and OO-looking methods on structs, which I miss in C).
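A minimal sketch of that sugar, for the curious (the struct and function names here are hypothetical, just to illustrate UFCS):

struct Point { int x, y; }

// A free function over the struct...
int norm1(Point p) { return p.x + p.y; }

void main()
{
    Point p = Point(3, 4);
    // ...which UFCS also lets you call method-style: p.norm1() is rewritten to norm1(p).
    assert(p.norm1() == norm1(p));
}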
That said I really tried to use D for my projects, but I had to give up for a rather surprising reason: the C interop is so good, that 1) most libraries provide a 1-1 translation of their C APIs, which ends up being ugly, non-idiomatic D that forces me to think in both C and D when coding; and 2) debuggers are not aware of D types and idioms, so when debugging, I have to again think in both C and D. Both of those add up to about 90% of the coding time, which is to say that, 90% of the time, when using D, I felt I had to code in 2 languages at the same time.
I'll skip some of the other issues I ran into, because I think a lot of the problems with D would go away if it had a large active community that would put the work in to maintain the D ecosystem, but that's a bit of a chicken and egg problem.
In the end, I decided that for me the reduced language overhead, solid ecosystem and modern conveniences of gnu17 C were more valuable in practice than the sweet features that D had to offer, and that made me a little sad, but I'm hopeful that one day D will make a strong comeback.
I know Walter gets notifications when D is mentioned on HN, and I imagine that if he read through this he'd shake his fist at me for saying interop-so-good-it's-bad, but, if I could make a parallel with Java, I'd say that in code that uses many 3rd-party libraries, D feels a bit like coding with JNI all the time (sorry). Ironically, in my opinion, D would benefit from having a community that rewrote popular libraries, instead of primarily relying on C interop.
Thanks Walter, that's a great example and I like that, because parens are optional, it could also be written as
e.d.c(3).b.a
which is even cleaner.
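For anyone following along, a minimal compilable sketch of that rewrite (the functions below are hypothetical stand-ins for the a through e above):

int d(int x) { return x * 2; }
int c(int x, int n) { return x + n; }
int b(int x) { return x - 1; }
int a(int x) { return x * x; }

void main()
{
    int e = 10;
    // Conventional nesting, read inside-out:
    int r1 = a(b(c(d(e), 3)));
    // UFCS with the optional parens dropped, read left-to-right:
    int r2 = e.d.c(3).b.a;
    assert(r1 == r2);
}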
For those who are thinking UFCS is a trivial detail, consider that the shell and some other languages have pipe operators (|>) to make the code flow intuitively the same way as the data.
In my opinion, in a C-like language, managing to squeeze so much functionality out of the '.' operator without any downsides is the mark of a well thought out, elegant language.
Absolutely! Speaking of operators, out of curiosity, what's the reason for using '!' with templates?
Naively, I would think that making template instantiation look the same as a function call would be a desirable feature, with ambiguous calls needing to be resolved by the user.
Not many characters left in ASCII. Reuse of an existing operator almost required.
Binary operators cannot be used, as that would be grammatically ambiguous and would need resolution at the semantic pass, which is a big no-no (that's why C++ is so slow at compiling: it cannot be parsed without semantic analysis). This left only the two exclusively unary operators, ! and ~. As ~ was repurposed for string concatenation, only ! remained.
templ!thing(a,b)
I would have thought that templ(thing)(a,b) would have been a good solution, as it is what is used in the declaration/definition side of templates, but this would have made removing redundant () not possible in UFCS expressions.
is it a function? or are you calling thing's function?
you can do:
templ!(thing)(a, b)
but did you mean:
templ!(thing(a, b))()
i personally always use !(), no matter what, and it's annoying to type. i don't want to waste time constantly trying to figure out what is what; it's mentally draining
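For reference, a minimal sketch of the ! syntax (the template and names here are hypothetical):

// A function template: T is a compile-time parameter.
T twice(T)(T x) { return x + x; }

void main()
{
    // Template argument after '!', runtime arguments in the usual parens:
    int i = twice!(int)(21);
    // Parens after '!' are optional for a single template argument:
    double d = twice!double(1.5);
    // T can also be deduced, with no '!' at all:
    auto j = twice(21);
}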
kotlin became very useful by focusing on being able to consume Java code
it allowed them to have a huge presence on android; that's an enabler
it profits Zig as well
not everything needs to be ranked #1 in TIOBE index
there is value in being the way it is, it's organic, and no company gets to control its fate
> the C interop is so good, that 1) most libraries provide a 1-1 translation of their C APIs, which ends up being ugly, non-idiomatic D that forces me to think in both C and D when coding;
what do you mean? it's the same, function and data
struct Data {}
do_this(&myData);
this is valid D, it's also valid C
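For what it's worth, a minimal sketch of how direct the interop is (declaring printf from libc by hand; normally you'd just import core.stdc.stdio):

// C linkage; no binding generator or wrapper layer needed.
extern (C) int printf(const char* fmt, ...);

void main()
{
    // D string literals are zero-terminated, so the C API works as-is.
    printf("hello from C, called from D\n");
}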
the problem i think you have is you are abusing OOP and think it's the only way of doing things, which is wrong, and this explains the sad state of software nowadays ;)
but it's weird when you then say you decided to stick with C; you contradict yourself