
Pointless, superficial review, standard for Wired, Verge, etc.: no brightness uniformity check, no color uniformity check, no color accuracy check, no coating grain check.

Meanwhile a good number of reports mention terrible uniformity issues with that model.



These publications are entertainment, sometimes pretending they are more than that.


Regurgitated press releases.


When I want display reviews, I go to rtings.com.

I haven't found anyone who compares.


Rtings publishes charts in abundance, but the subjective quality of a monitor is more important. For example, a chart will tell you a monitor has low color deviation from sRGB after calibration, but won't tell you that the monitor UI takes 10 laggy clicks to switch from sRGB to DCI-P3 and will reset your selection every time you toggle HDR mode.

I admire Rtings' attempts to add more and more graphs to quantify everything from VRR flicker to raised black levels. They were helpful when I last shopped for a monitor. But the most valuable information came from veteran monitor review sites such as Monitors Unboxed and TFTCentral.


Agreed; rtings has by far the best reviews and comparisons, and detailed tests nobody else seems to do.

I originally found them because they were one of the only sources that tested for PWM flicker in monitors.


Rtings is good at measurements but bad at conclusions. Take their numbers and form your own opinion. Prad.de is better in that respect.


They do a good job of supporting whatever comparisons you want to make, which is useful if you have different preferences; however, I do think they have clear conclusions in many cases, to capture strengths, weaknesses, and "best of" recommendations.


Rtings.com's "conclusions", "scores", and "recommendations" do not always correspond to their own numbers.


Exactly. Their ten-point scales have no obvious relationship to the underlying measurements, where measurements are even provided, and they rescale the points every so often. (They call this their "methodology version.")

I've noticed that, when a new (expensive, high-commission-generating) product comes out, it often has middling scores at first, and then, a few months later, they've revised their methodology to show how much better the pricey new product is.


1) I trust rtings to not change their position on the basis of what makes them money; that trust is their whole brand.

2) I have not seen products jump from middling to high before, but I have seen the scores change with new methodologies, and sometimes that has the net effect of lowering the scores of older devices. Typically, that seems to represent some change in technology, or in what people are looking for in the market. For instance, I would expect (have not checked) that with the substantially increased interest in high-refresh-rate monitors, the "gaming" score has probably changed to scale with gamer expectations of higher frame rates. That would have the net effect of lowering the score of what was previously considered a "good" monitor. This seems like an inherent property of improving technologies: last year's "great" can be this year's "good" and next year's "meh".
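
As a toy illustration of that renormalization effect (my own sketch, not rtings' actual formula, which isn't public in this form): if a refresh-rate sub-score is scaled against the fastest panels buyers currently expect, raising that ceiling from 144 Hz to 240 Hz lowers a 144 Hz monitor's score without the monitor changing at all.

    # Toy model of sub-score renormalization; NOT rtings' real methodology.
    # The linear scaling and the 0-10 range are assumptions for illustration.
    def refresh_subscore(hz: float, ceiling_hz: float) -> float:
        """Map a refresh rate onto a 0-10 scale, capped at the current ceiling."""
        return round(10 * min(hz, ceiling_hz) / ceiling_hz, 1)

    monitor_hz = 144
    print(refresh_subscore(monitor_hz, ceiling_hz=144))  # 10.0 under the old ceiling
    print(refresh_subscore(monitor_hz, ceiling_hz=240))  # 6.0 once 240 Hz is the bar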

Personally, I never pay much attention to the 0-10 scores in the first place, and always just make tables of the features and measurements I care about. The only exception is for underlying measurements that are complex and need summarizing (e.g. "Audio Reproduction Accuracy").
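
For what it's worth, that table-making habit is easy to script. A minimal sketch (every spec below is a placeholder I made up, not real rtings data):

    # Sketch of "ignore the 0-10 score, tabulate what you care about".
    # All values are invented placeholders, not real measurements.
    candidates = {
        "Monitor A": {"max_hz": 144, "sdr_nits": 420, "avg_dE": 1.3, "pwm_flicker": False},
        "Monitor B": {"max_hz": 240, "sdr_nits": 380, "avg_dE": 2.1, "pwm_flicker": True},
    }

    columns = ["max_hz", "sdr_nits", "avg_dE", "pwm_flicker"]
    print(f"{'model':<12}" + "".join(f"{c:>12}" for c in columns))
    for model, specs in candidates.items():
        print(f"{model:<12}" + "".join(f"{str(specs[c]):>12}" for c in columns))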


Another good one is their keyboard reviews. They do good work.


Notebookcheck and the German c't magazine also do a really decent job.


I've just bought one. The color uniformity is terrible across the screen. It's tolerable to me, but not what I'd expect at the price. Everything else is quite fantastic though.


They gave a color accuracy number; was there something wrong with it?


It needs to be independently measured, not just repeated from the spec sheet.


"I measured the average color error"

"I was able to measure 640 nits"

I don't see the problem.


OK, fine, yes, I missed it. But an average Delta-E is pointless when the review doesn't specify which mode it was measured in, what the maximum Delta-E is (not the average - I've used monitors with an average dE below 1 that had massive greenish/pinkish tints, because the maximum dE on some colors well exceeded 4), what the dE is away from the center of the screen, and what the gamma and colorspace coverage errors are. Check prad.de; they do it the right way.
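
To make the average-vs-maximum point concrete, here is a small sketch with made-up Lab patch values (using CIE76 distance for simplicity, where most reviewers report dE2000): seven near-perfect patches and one badly tinted one average out below 1 while the worst patch is near 5.

    # Made-up patch measurements showing how an average Delta-E can hide
    # a visible tint. Uses CIE76 (plain Euclidean distance in CIELAB);
    # reviewers usually report dE2000, but the effect is the same.
    import math

    def delta_e_76(lab_ref, lab_meas):
        """CIE76 color difference: Euclidean distance in CIELAB."""
        return math.dist(lab_ref, lab_meas)

    # (reference Lab, measured Lab) pairs -- all values invented
    patches = [
        ((50.0, 0.0, 0.0),    (50.2, 0.1, -0.1)),
        ((60.0, 20.0, 10.0),  (60.1, 20.2, 10.2)),
        ((40.0, -30.0, 25.0), (40.2, -29.9, 25.1)),
        ((75.0, 5.0, -5.0),   (75.1, 5.3, -4.9)),
        ((30.0, 15.0, -20.0), (30.2, 15.1, -20.2)),
        ((85.0, -5.0, 10.0),  (85.1, -5.2, 10.1)),
        ((55.0, 25.0, 30.0),  (55.2, 25.2, 29.9)),
        ((70.0, 10.0, -15.0), (70.4, 5.5, -13.0)),  # the greenish outlier
    ]

    errors = [delta_e_76(ref, meas) for ref, meas in patches]
    print(f"average dE: {sum(errors) / len(errors):.2f}")  # ~0.86, looks great
    print(f"maximum dE: {max(errors):.2f}")                # ~4.94, the visible tint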



