The fact is that, for normal usage, Firefox with uBlock Origin is faster than Chrome without ad blocking. On Android this is especially noticeable.
Well... the Russian Spectrum clones had quite a successful career, and they made a lot of improvements over the original Sinclair and Amstrad designs. The Pentagon and the Scorpion with extra RAM, or the ATM, come from pure Russian ingenuity.
I agree with you. Also, there are people (like me) who prefer small commits (that don't break stuff) over huge mega commits. If I do make small broken/WIP commits, they stay on my working branch and I do an interactive rebase to merge them into good, cohesive commits.
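For anyone unfamiliar with that workflow, here's a minimal self-contained sketch (all repo, branch, and file names are made up for the demo): it creates a throwaway repo, makes two WIP commits on a feature branch, then squashes them into one commit via `git rebase -i`, using `GIT_SEQUENCE_EDITOR` to script the edit you'd normally make by hand in the todo list.

```shell
#!/bin/sh
# Demo of cleaning up WIP commits with an interactive rebase.
# Everything happens in a throwaway repo under mktemp.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b main
git config user.email "demo@example.com"
git config user.name "Demo"
echo "base" > file.txt
git add file.txt
git commit -qm "Initial commit"

# Work happens in small, possibly broken WIP commits on a branch.
git switch -qc feature
echo "step 1" >> file.txt && git commit -qam "wip: step 1"
echo "step 2" >> file.txt && git commit -qam "wip: step 2"

# Non-interactively mark every commit after the first as "fixup" --
# the same edit you'd make in the editor during `git rebase -i main`.
GIT_SEQUENCE_EDITOR="sed -i -e '2,\$s/^pick/fixup/'" git rebase -qi main

git log --oneline main..feature   # a single squashed commit remains
```

The `fixup` action folds a commit into the previous one and discards its message; use `squash` instead if you want to combine the messages in the editor.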
The past was bad. But the present is far worse. Tell that to the people disappeared into the ICE concentration camps. Or to any trans person in any bad state.
What does "GPU" mean here? Previous uses of the term seemed to imply "dedicated hardware for improving rendering performance" which the SVGA stuff would seem to fall squarely under.
The term GPU was first coined by Sony for the PlayStation with its 3D capabilities, and has been associated with 3D rendering since. In some products it stood for Geometry Processing Unit, again referring to 3D. Purely 2D graphics coprocessors generally don’t fall under what is considered a GPU.
It has been associated with 3D rendering, but given that things like the S3 86C911 are listed on the Wikipedia GPU page, saying "Accelerated GUIs don't need GPU" feels like attempting to win an argument by insisting on a term definition that diverges significantly from standard vulgar usage [1], which doesn't provide any insight into the problem originally being discussed.
[1] Maybe I've just been blindly ignorant for 30 years, but as far as I could tell, 'GPU' seemed to emerge as a more Huffman-efficient encoding for the same thing we were calling a 'video card'.
I don’t agree with what you state as the vulgar usage. “Graphics card” was the standard term a long time, even after they generally carried a (3D) GPU. Maybe up to around 2010 or so? There was no time when you had 2D-only graphics cards being called GPUs, and you didn’t consciously buy a discrete GPU if you weren’t interested in (3D) games or similar applications.
In the context of the discussion, the point is that you don’t need high-powered graphics hardware to achieve a fast GUI for most types of applications that WPF would be used for. WPF being slow was due to architectural or implementation choices.
Most people consider GPU to mean "3D accelerator" though technically it refers to any coprocessor that can do work "for" the main system at the same time.
GPU-accelerated GUI usually refers to using the texture mapping capabilities of a 3D accelerator for "2D" GUI work.
I remember these disks from my Spectrum +3. Indeed, harder and more resistant than the 3.5" ones. Sad that the format was on the losing side and never evolved beyond the 128K (or was it 256K?) it could store on a single side.