Audiophile-grade self-delusion is, fortunately, fairly rare. What's far more common today is the error many consumers make about 4K TV. I have friends who show me their new 4K TV and think it looks "amazing" just because they are feeding it a source labeled "4K" resolution, without considering (or even knowing) the bit rate of the compressed source.
While there are good 4K sources like UHD Blu-ray discs and a few decent streaming sources (though it's spotty and inconsistent), the vast majority of "4K" sources look pretty awful because they're wildly over-compressed. And the level of self-delusion from some of my friends – who are otherwise very smart – is near-emperor's-new-clothes level.
A recent example was the Super Bowl, one of the few "special events" Comcast aired in 4K (only available on the latest version of Comcast's streaming boxes, which few people have – and 4K is not available if you use the DVR). I checked it out on a current-year higher-end 65" 4K TV (~$3k) that I adjusted myself to disable the myriad default settings that claim to "enhance" signals but often mangle them in unexpected ways.
Even set up properly on a good TV it looked terrible. The tell-tale DCT macro-blocking was rampant, and in many shots the extra resolution of the 4K version seemed to highlight the 'peggishness' worse than the 1080p version (which I was cross-checking against out of now-morbid curiosity).

The Super Bowl is widely claimed to be the best-quality 4K live broadcast: $250,000 lenses on the best cameras, sent through the best signal paths, managed by hundreds of engineers across a small city of the world's best production trucks. All stomped into a peggish mess so that cable and satellite companies can shove hundreds of channels of infomercials and decades-old Murder She Wrote episodes down the pipe simultaneously. I guess they'll keep doing it until enough people demand better or someone offers an alternative.

Afterward I searched online, and it appears the only widely accessible way to view the Super Bowl in decent 4K was to live in one of the few cities that currently have broadcast TV stations airing true 4K and set up an over-the-air antenna. While that signal is still compressed, due to FCC specifications for signals actually broadcast OTA the compression is pretty mild, and it apparently looks sensational compared to what most people saw at home on their "awesome" new 4K TVs.
Oh, this is something I've been wondering about - I've had a 4K TV for years, but I don't think I've ever seen a 4K Blu-ray (I don't think I even know anyone who has a Blu-ray player).
Are you saying that streaming a movie (say, a recent Marvel one) in 4K from Disney Plus will look noticeably worse than if I played the same movie from a Blu-ray player?
The video file for that movie stored on a UHD Blu-ray would probably be in the neighborhood of 50-90GB, depending on various factors such as movie length, disc capacity (I believe UHD discs come in a couple of different capacities), the compression rate used, and the inherent compressibility of the movie (footage with a lot of grain, for example, will not compress as well).
A streaming service would probably compress that same 4K movie down to around 10-15GB at most, and often smaller, again depending on the same factors. In my opinion most people wouldn't notice a huge difference, if any, but it would definitely be noticeable if you knew what to look for: loss of fine detail in fast-moving scenes with a lot going on, banding on the streamed version where the Blu-ray shows a smooth gradient in a low-light scene, and other compression artifacts like that.
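Back-of-the-envelope, those size ranges fall straight out of average bitrate times runtime. A quick sketch (the 60 Mbps and 15 Mbps figures are assumed averages for a disc and a stream, and 140 minutes is just a typical feature-film runtime):

```python
def size_gb(bitrate_mbps: float, runtime_min: float) -> float:
    """Approximate file size in GB for a given average video bitrate."""
    total_bits = bitrate_mbps * 1e6 * runtime_min * 60  # bits over the whole runtime
    return total_bits / 8 / 1e9                         # bits -> bytes -> GB

runtime = 140  # minutes, a typical feature film

print(f"UHD disc  @ 60 Mbps: {size_gb(60, runtime):.0f} GB")   # ~63 GB
print(f"Streaming @ 15 Mbps: {size_gb(15, runtime):.0f} GB")   # ~16 GB
```

Which lines up with the 50-90GB disc range above (longer films and higher bitrates push toward the top end) versus the 10-15GB streaming range.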
Blu-ray torrents could fall anywhere on the spectrum depending on how the file was compressed by whoever is distributing the torrent. If you see one labeled "remux," that usually means a video file taken directly from the Blu-ray without any additional compression, so watching it would give you the same experience as watching the Blu-ray directly.
A 4K Blu-ray will look significantly better than streaming in 4K. Streaming has one advantage: sometimes, on newer hardware, it can use newer/better video codecs than DVD/Blu-ray/4K Blu-ray, which are frozen in time at a specific decoding standard. However, a 4K Blu-ray can store up to 100GB of data, and the massive increase in bitrate more than makes up the difference.
When downloading video, you can grab a remux (the raw video files off the disc, just stripped of DRM) or a re-encode (those video files compressed again with a better codec). A re-encode will be about the same size as a streaming release and generally has higher quality: re-encodes tend to use cutting-edge video compression, and pirates are willing to throw more compute time at a single output format than a streaming service, which has to produce video files for many target devices.
A 4K Blu-ray (or remux) will get you the best quality, followed by re-encodes, and then WEB-DLs (files captured directly from a streaming service).
For a more recent example, I found the desert/spice scenes in Dune to be unwatchable on HBO Max but very pleasant in my subsequent theater visit and later on the 4K disc. In my experience Netflix is the worst offender of the major streaming platforms, and iTunes/Apple TV+ can approach some Blu-rays (though there’s no trick here: they send more data down the pipe).
If you have a decent home theater setup, it makes a very noticeable difference in both picture and sound. Good Blu-ray and 4K Blu-ray releases often have lossless audio codecs, which no streaming service currently offers. Though the sound differences may be due more to mixing: streaming mixes tend to prioritize the 5.1 track still sounding good on lowest-common-denominator setups.
But the bitrate alone should tell you something. You will typically get around 15-20 Mbps for 4K streaming, and often lower, though Apple's service apparently hits up to 40 Mbps on occasion. 4K discs run at up to around 128 Mbps.
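Another way to see what those Mbps numbers mean: divide the bitrate by the number of pixels the encoder has to describe each second. A rough sketch, assuming a 3840x2160 picture at 24 fps (film frame rate; broadcast sports at 50/60 fps would have an even tighter budget):

```python
# Bits-per-pixel budget at the bitrates discussed above.
width, height, fps = 3840, 2160, 24
pixels_per_sec = width * height * fps  # ~199 million pixels every second

for label, mbps in [("streaming low", 15), ("streaming high", 40), ("UHD disc", 128)]:
    bits_per_pixel = mbps * 1e6 / pixels_per_sec
    print(f"{label:>14}: {bits_per_pixel:.2f} bits per pixel")
```

The disc gets roughly eight times the bits per pixel of a low-end 4K stream, which is exactly where the fine detail and smooth gradients come from.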
4K TV signals invariably suck, absolutely. But depending on people's preferences, they may think the vibrant colors and improved sharpness of even a mediocre 4K HDR stream look "better" than a higher-bitrate 1080p SDR video. Personally, I'm a bit of a purist, so I prefer SDR Blu-ray unless the source material was filmed with HDR in mind, but I totally understand why others prefer the "pop" of a 4K remaster streamed over Netflix. It's all different preferences.
> Audiophile-grade self-delusion is, fortunately, fairly rare.
Oh, not from where I see the world! hehe - in fact, I see most STEM-educated people of reasonably good IQ by and large living a mythology they don't even know they are living in, because they've never really understood the difference between knowledge and understanding, or the whole purpose of the humanities.
Mythology is the stories we tell about how we as individuals relate to society and the cosmos.
Elon Musk functions as a sort of deva or heroic god myth that many people do not even realise they are buying into.
Mythology even includes how we structure science - the "genius," the archetypes, and the roles and stories we tell about heroes and how scientific activity takes place - these are myths that have causal effects in producing actual science. Mythology is meta to everything. It's literally the meaning about who we are and our place in the world. There are many examples. Pick your mythology wisely, because if you don't, you'll get a poor version of one by cultural default.
I can tell you, a lot on Hacker News does not look too different from the audiophile community from this perspective hehe
As long as it looks better than 1080 I'm happy with it. Are you saying I shouldn't watch or can't enjoy 4K unless it's perfect? Sounds like you're the one suffering from self-delusion.