danburzo's comments | Hacker News

Keep in mind this shows the “live most polluted major city ranking, 11:00–12:00” (EEST), so these are rather short-term measurements.


As many have pointed out here, the nature of caching has changed in the current climate of ubiquitous HTTPS, and I want to add a paragraph or two about it. Is there a good summary somewhere that I could reference? What are the usual, most prevalent uses of HTTP intermediaries involving caches, besides CDNs and origin-controlled caches (e.g. Varnish)?


HN is full of noobs loudly proclaiming what they don't know is true these days. Ubiquitous HTTPS does not change the nature of private browser caches, and it only nullifies the proxy-related cache headers if the origin encrypts traffic all the way to the client, which is quite rare in real life, unless we are merely talking about a dude serving his blog from his basement computer.

In general, your answer depends on where the TLS cert terminates. In most situations a CDN or a reverse proxy is involved, and the TLS cert you use to encrypt traffic from the origin to the proxy is different from the one the proxy uses to encrypt traffic from it to the browser. Whenever a MITM intermediary is involved, you should read the intermediary's documentation. These usually include Cloudflare, AWS CloudFront, Akamai etc. With some exceptions, like the Vary header as pointed out elsewhere, these vendors largely follow HTTP caching semantics for proxy caches.
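To make the split concrete, a single Cache-Control header can already address the two tiers separately; this is a generic sketch, not something taken from the article or any particular vendor's docs:

    Cache-Control: public, max-age=60, s-maxage=86400, stale-while-revalidate=300

Here `max-age` governs private browser caches (which HTTPS does not affect), `s-maxage` overrides it for shared caches such as the CDN or reverse proxy that terminates TLS, and `stale-while-revalidate` lets a cache serve a stale copy while it refetches in the background. Whether a given intermediary honors each directive is exactly the kind of thing you have to check in its documentation.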


Thanks! I’ve updated the introduction with some ‘now vs then’ pointers.


Good call! Honestly I just wanted to wrap it up before the holidays, but you’re right that a small section on Vary would have been useful.

Things like non-conforming caching services made me punt actual suggestions to a later article, as I wasn’t sure how my sense of the RFC interacted with the real world. HTTP Caching Tests seems like a great resource for this, but only includes Fastly out of the big providers, and it seems to be doing okay with Vary. https://cache-tests.fyi/


Updated the article with some information on the `Vary` and `No-Vary-Search` headers. I’ve left out the details of how revalidation works with `Vary` since I haven’t been able to reconcile yet what the spec seems to encourage vs what the tests on cache-tests.fyi suggest is conformant behavior.
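For anyone skimming the thread, the two headers look roughly like this (the query parameter names are just placeholders):

    Vary: Accept-Encoding
    No-Vary-Search: params=("utm_source" "utm_campaign"), key-order

`Vary` tells caches that the stored response depends on the listed request headers, so a gzipped response won't be served to a client that can't decode it; `No-Vary-Search` tells them to treat URLs that differ only in those query parameters (or only in parameter order) as the same resource. The latter is still fairly new.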


I’m sorry you didn’t get anything out of it. I wasn’t operating at the edge of caching knowledge, just a person refreshing and clarifying for themselves how caching works. Some things were new to me, and after spending so much time with the RFC, I just thought others may benefit or, more selfishly, would point out errors or ways to make it better.

I mean, do those <meta> tags really suggest someone who’s into SEO? Call me stale but what I really want is validation :-)


You could probably just ask Sasha Shor:

“As one of the creative leads on the id software brand team at Pyro in Texas, I worked on the logo, font, packaging and advertising, as well as the global E3 launches, for Quake, Quake 2, 3 and 4, some of the most iconic video game launches in the history of gaming.” http://www.sashashor.com/new-page


I don’t agree with the assertion made in the article, but if they did use a font without securing permission, I somehow doubt Shor would want to admit that, so sometimes you do need something other than a first-party source.


>> ...but if they did use a font without securing permission...

IANAL but I believe you can trace even a commercial font and use it. Making a bitmap font from print and using it in a game should be fine.


There was a Quake 4?! ...ah yes... no.. heh?

Must've blotted it out to preserve the affection I had for the first 3.


It was a sequel to Quake II by Raven Software, using id Tech 4 (the DOOM 3 engine), notable for having a first-person cutscene in which you are "stroggified", transformed into one of the mechanical zombie soldiers you've been fighting against all game (but your NPC teammates save you at the last second, before the implant that removes your free will goes in). Apparently this was talked about a lot in the marketing leading up to release, but when I played it as a kid I never knew anything about that, so it was a real shock when I got to that part.

One cool thing the game did was they used the DOOM 3 "interactive panels" tech to make not only English-language human-manufactured "touchscreens", but also Strogg-language alien-manufactured "touchscreens", that you had to interact with to open doors and so forth. After becoming "stroggified", the glyphs on the Strogg touchscreens shift and you can now read them in English.

I went back and replayed it a few years ago and it's really pretty generic as far as shooters of that era go, but I thought Raven did a decent job given what they had to work with.


I used to sneak into my older sister's room to play it on her PC when I was a kid. I remember the stage with the toxic chemical facility or whatever it was, where the dead bodies would rise up behind you after you passed them by. I was too scared to play past it after getting jumpscared several times from behind, lol.


I believe I probably started playing it when it came out, but abandoned it pretty early on, hence my initial confusion - like something you half remember.

The first 3 games though were great in their different ways, but Quake 1 will always be my favourite.


It was 'ok'; it did, however, give us the excellent ET:QW, which was unbelievably good fun.


This is exactly my jam!

May I suggest adding a distinct style for visited links, so it’s easier to keep track?


Great idea, and done! :-)


With the goal of showing nice, clear images to as many devices as possible, while optimizing the file size, I would:

1. decide if I want to use any of the newer image formats. If so, each needs its own `<source type=''>` in a `<picture>` element, front-loading the most efficient formats.

2. decide if I want to serve different densities for the image.

For specifying densities, width descriptors + `sizes` attribute will always compute to a more useful effective density than density descriptors, if you can get `sizes` in the ballpark of how the image is actually laid out.

For lazily-loaded images, `sizes=auto` will do that for you, when it becomes universally supported.
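A minimal sketch of the width-descriptor approach, with made-up file names and breakpoints:

    <!-- srcset lists the available widths; sizes describes the layout slot.
         The browser computes the effective density from these itself. -->
    <img
      srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
      sizes="(min-width: 60em) 40rem, 100vw"
      src="photo-800.jpg"
      alt="">

With `loading=lazy` you could eventually replace the media condition with `sizes=auto`, once browser support is broad enough.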


Thank you for the kind words! I have fun reading specs, but the HTML Standard is denser and organized more… cross-referentially than the average CSS spec, so there was a bit of putting-things-together.


Thank you for the kind words!


I haven’t explicitly mentioned it in the article, but `srcset` + `sizes` is a way to provide dynamic densities for one image format, then multiply that with one `<source>` for each image format:

    <picture>
      <source srcset='…' sizes='…' type='image/avif'>
      <source srcset='…' sizes='…' type='image/webp'>
      <img srcset='…' sizes='…'>
    </picture>
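Filled in with concrete (entirely made-up) file names, that pattern would look something like:

    <!-- File names and widths below are hypothetical. -->
    <picture>
      <source type="image/avif" srcset="hero-800.avif 800w, hero-1600.avif 1600w" sizes="100vw">
      <source type="image/webp" srcset="hero-800.webp 800w, hero-1600.webp 1600w" sizes="100vw">
      <img srcset="hero-800.jpg 800w, hero-1600.jpg 1600w" sizes="100vw" src="hero-800.jpg" alt="">
    </picture>

The browser takes the first `<source>` whose `type` it supports, then applies the usual `srcset`/`sizes` selection within it.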

If this is what you mean, maybe it would be better to include in the article?


Ah, perfect. For some reason I didn't realise that source has the same attributes as img.

If it's not too much work, I think it would be good to add to the article for completeness :)

