You should read about dynamic arrays more carefully. They have amortized O(1) insertion, which beats the O(log n) of a balanced tree, and the data is contiguous in memory, which gives better cache locality than a tree. They are one of the most popular data structures.
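To make the amortized-O(1) claim concrete, here is a minimal sketch of my own (not from the course, and the class name is just illustrative): a toy array that doubles its capacity when full, with a counter showing that total element copies stay within a small constant factor of the number of appends.

```python
# Toy dynamic array (hypothetical sketch): capacity doubles when full;
# `copies` counts elements moved during reallocations, to illustrate
# the amortized O(1) append claim.

class DynArray:
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None]
        self.copies = 0  # total elements moved by reallocations

    def append(self, x):
        if self.size == self.capacity:
            # Grow: allocate double the space and copy everything over.
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):
                new_data[i] = self.data[i]
            self.copies += self.size
            self.data = new_data
        self.data[self.size] = x
        self.size += 1

arr = DynArray()
n = 100_000
for i in range(n):
    arr.append(i)

# With doubling, total copies are 1 + 2 + 4 + ... < 2n,
# so the amortized copy cost per append is O(1).
print(arr.copies, arr.copies / n)
```

Run it and the copies-per-append ratio stays below 2 no matter how large n gets; that constant bound is the whole point of the doubling policy.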
Parts of your post also seem to me to be quite boastful and low-value: (paraphrasing) "the course takes itself very seriously", "why spend so much time teaching so little material", "these topics are mostly old; just read Knuth", and "dynamic programming is easy; I learned it in 90 seconds and then did my PhD in it".
I had a revision of that post ready, but
only just after the two-hour edit window
had closed.
The course pressed the students hard to
devote, what was it, four hours a week to
group sessions, with more hours of
independent study. That's asking a lot of
the students.
In response I had a lesson and purpose
in that post: (A) That collection of
fundamental algorithms hasn't changed
much; it was much the same 40 years ago.
(B) Nearly all the algorithms are quite
simple, and each one can be learned
quickly, including the derivation of its
big-O performance, and coded, run, and
tested in 2 hours or so. (C) I mentioned
Knuth v. 3 as a reference: tough to
improve on that Knuth volume as a
reference for such algorithms. (D) For
hashing, network flows (graph search),
and dynamic programming I gave really
good references; tough to compete with
any of them. I used some of my experience
to illustrate (A) -- (D).
That lesson should be quite good news
for any HN readers considering studying
the algorithms.
> Parts of your post also seem to me to be
quite boastful and low-value:
No, I just used some of my experience to
give examples of my points.
> You should read about dynamic arrays
more carefully. They have amortized O(1)
insertion ....
I saw all that. You get the O(1)
property only under some assumptions and
some math derivations, and I mentioned
the math.
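Since I mentioned the math, here is the standard derivation in brief, under the usual assumption that capacity starts at 1 and doubles when full:

```latex
% Appending n elements triggers reallocations at sizes 1, 2, 4, ..., 2^k <= n.
\text{total copies} \;=\; \sum_{i=0}^{\lfloor \log_2 n \rfloor} 2^i \;<\; 2n,
\qquad
\text{amortized cost per append} \;<\; \frac{n + 2n}{n} \;=\; 3 \;=\; O(1).
```

Note how much the conclusion leans on the growth-factor assumption; change the policy and the sum changes.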
Two obvious problems:
(1) It is trivial for any of us to think
of applications where the new allocations
and copying would be wildly wasteful.
(2) For the assumptions, we will rarely
have enough information to have much
confidence in our math derivation.
We also can think of
(3) The reallocations, when there are a
lot of them, will create problems for
memory management and garbage collection.
Sure, any of us can think of niche
situations where (a) we do a few
reallocations and (b) then go for hours,
..., months with no more reallocations and
with the advantages of arrays.
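To make (1) concrete, a small simulation of my own (doubling policy assumed, function name just illustrative): the per-append copy counts are mostly zero, but the unlucky appends copy the whole array, an O(n) spike that the amortized figure hides.

```python
# Per-append copy counts for a doubling dynamic array (hypothetical
# sketch). Amortized O(1) hides occasional O(n) spikes at reallocation.

def per_append_copies(n, capacity=1):
    costs = []
    size = 0
    for _ in range(n):
        if size == capacity:
            capacity *= 2
            costs.append(size)  # this append copies every existing element
        else:
            costs.append(0)     # this append copies nothing
        size += 1
    return costs

costs = per_append_copies(17)
print(costs)       # spikes of 1, 2, 4, 8, 16 at the doublings
print(sum(costs))  # total still under 2 * 17
```

Whether those spikes matter depends on the application; they are the reason amortized O(1) and worst-case O(1) are different claims.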
Dynamic arrays don't belong on a list of
Best Algorithms.
Again, my guess is that the course's
interest in dynamic arrays is as an
opportunity to do the math derivations.
The people that MIT course is intended
for may well be HN readers, so an issue
for HN readers is, should they devote a
lot of time to that course? We review
movies, restaurants, etc., and we might
also review courses.
Your attack on my review was mostly an
attack on me: You resented that I
mentioned some of my background, and you
responded by attacking me. Instead, make
a good contribution of your own, maybe as
a review of that MIT course.
I'll state the basic lesson again:
The algorithms in that course are nearly
all quite good but old, with some really
good old references, and can be learned
quickly, say, the whole course in a few
weekends.
Thanks for taking the time to read and respond. I admit the second paragraph of my post was a bit aggressive and I was on the fence about posting it. I don't have a problem with you sharing your background but the parts I mentioned previously came off in a certain way to me.
I found your initial argument on dynamic arrays dismissive because you admitted you had never heard of them, then implied that they don't make sense, as if to justify why you had never heard of them. I find that intellectually dishonest and it really ticked me off; it's just confirming one's own bias. I still find your argument a bit dismissive, although we can agree to disagree. It's not a question of worthiness for a list of best algorithms or of fancy math derivations. They are widely used in practice, are O(1) for many operations, work well with caches, and are worth studying for those reasons.
As for making a "good contribution of my own" by reviewing the course, I don't feel the need. It's a standard undergrad algorithms course of the kind that most CS students would take. I don't think there's any value in reviewing the syllabus when they all tend to cover the same material.
I probably won't reply again, so (sincerely) have a good day. I realize you feel attacked, but if you're going to opine on something, then other people might opine on your opinion. You don't hold back in your writing style, so I didn't either. I do apologize if I made it too personal; I read some things that I couldn't let slide.