
Well this is patently false. For the past 3 decades, programmers have intentionally made choices that perform as poorly as the hardware will allow. You can pretty much draw a parallel line between hardware advancement and the bloating of software.

Displaying hypermedia hasn't gotten 100x harder than it was 20 years ago, yet applications use 10x-100x more memory and CPU than they used to. That's not good software, that's lazy software.

I just loaded "aol.com" in Firefox private browsing. It transferred 25MB, the tab is using 307MB of RAM, and the javascript console shows about 100 errors. Back when I actually used AOL, that'd be nearly 10x more RAM than my system had, and would be one of the largest applications on my machine. Aside from the one video, the entire page is just formatted text and image thumbnails.
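If anyone wants to reproduce those numbers, a rough sketch along these lines works (this assumes Playwright with its Firefox build installed; the URL and wait time are just what I used above, and the per-tab RAM figure still has to be read by hand from about:memory or your task manager):

    # Rough sketch: sum the bytes of every response and count console errors
    # while loading a page. Requires `pip install playwright` and
    # `playwright install firefox`. Tab RAM isn't exposed here.
    from playwright.sync_api import sync_playwright

    URL = "https://www.aol.com"  # the page measured above

    total_bytes = 0
    console_errors = 0

    def on_response(response):
        global total_bytes
        try:
            # decoded body size; an approximation of bytes transferred
            total_bytes += len(response.body())
        except Exception:
            pass  # redirects and aborted requests have no body

    def on_console(msg):
        global console_errors
        if msg.type == "error":
            console_errors += 1

    with sync_playwright() as p:
        browser = p.firefox.launch()
        page = browser.new_page()
        page.on("response", on_response)
        page.on("console", on_console)
        page.goto(URL, wait_until="load")
        page.wait_for_timeout(5000)  # let late-loading ads and trackers finish
        browser.close()

    print(f"~{total_bytes / 1e6:.1f} MB fetched, {console_errors} console errors")

Note the byte count is of decompressed bodies, so it will overstate the on-the-wire transfer somewhat, but the order of magnitude is the same.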



> You can pretty much draw a parallel line between hardware advancement and the bloating of software.

I do not think it is surprising that there is a Jevons paradox-like phenomenon with computer memory, and as with other instances of the paradox, it does not necessarily follow that rising consumption is the result of a corresponding decline in resource-usage efficiency.



