
If you know a user is on version 3 and needs to update to version 5, then why not just send out all the patches between 3 and 5? Why do you need to generate a new patch for each pair of versions?

It feels a bit egregious when I have to download a 100MB update just because a few characters were buffed or nerfed. More involved changes end up being over 1GB.



Because it's not just version 3 to version 5, it's version 3 to version 84.

Not all versions are created equal either - one might be a character buff, another might reorder assets in the "big huge binary blob file" for performance improvements. At a certain point, rather than downloading 30MB per update for 25 versions and applying each incrementally (remember that you have to do them in order too), it's simpler to just download the full 1GB once and overwrite the whole thing.
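The tradeoff above is basically arithmetic. A minimal sketch, with made-up sizes (30MB per delta, 1GB full image, per the numbers in this thread), of how a client might pick the cheaper path:

```python
# Hypothetical sizes from the discussion above.
DELTA_MB = 30    # average size of one incremental patch
FULL_MB = 1024   # size of the full image

def cheapest_download(versions_behind: int) -> str:
    """Pick the incremental chain or the full image, whichever is smaller.
    Deltas must be applied in order, so the cost scales with how far
    behind the client is."""
    incremental = versions_behind * DELTA_MB
    return "deltas" if incremental < FULL_MB else "full image"

print(cheapest_download(5))   # a few versions behind: deltas win
print(cheapest_download(40))  # far behind: the full image is cheaper
```

With a long enough chain of ordered deltas, the incremental path costs more bandwidth than the full overwrite, which is exactly when "just ship the 1GB" stops being lazy and starts being rational.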


Microsoft made sure in Windows 10 that it's almost unusable without an SSD, so your big binary blob file has random r/w access anyway.

Most backup software has been able to do good binary deltas of arbitrary data for decades. Even dumb checkpointing solves the problem of downloading 25 versions: you download the latest checkpoint and the deltas from there.
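A minimal sketch of that checkpointing idea, assuming a hypothetical scheme where a full image is published every 10th version and single-step deltas fill the gaps:

```python
CHECKPOINT_EVERY = 10  # assumed: full images published at versions 10, 20, ...

def update_plan(current: int, target: int) -> list:
    """Return the downloads needed to go from `current` to `target`:
    the newest full checkpoint at or below `target` (if it's ahead of
    the client), then one delta per remaining version."""
    checkpoint = (target // CHECKPOINT_EVERY) * CHECKPOINT_EVERY
    plan = []
    start = current
    if checkpoint > current:
        plan.append(f"checkpoint-{checkpoint}")
        start = checkpoint
    plan += [f"delta-{v}" for v in range(start + 1, target + 1)]
    return plan

print(update_plan(3, 25))
# ['checkpoint-20', 'delta-21', 'delta-22', 'delta-23', 'delta-24', 'delta-25']
```

The chain a stale client has to apply is bounded by the checkpoint interval, not by how old its install is.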

Don't excuse poor design and programming; when you know the file structure, creating a differential update should be a short task. With a tiny bit of algorithmic knowledge you could even optimize the process to only download the needed assets inside your big binary blob: if an asset was changed 7 times during the last 25 versions, you only need to download the latest change.
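A sketch of that asset-level optimization, assuming hypothetical manifests mapping each asset name to the version it last changed in:

```python
def assets_to_fetch(client_manifest: dict, server_manifest: dict) -> list:
    """Fetch only assets whose latest change is newer than what the
    client has. An asset changed 7 times since the client's version
    still appears only once: just its newest revision is downloaded."""
    return sorted(
        name for name, changed_in in server_manifest.items()
        if client_manifest.get(name, -1) < changed_in
    )

# Hypothetical manifests: client stuck on version 3, server at version 27.
client = {"hero.tex": 3, "map.bin": 3, "music.ogg": 2}
server = {"hero.tex": 27, "map.bin": 3, "music.ogg": 25, "boss.tex": 26}
print(assets_to_fetch(client, server))
# ['boss.tex', 'hero.tex', 'music.ogg']  -- map.bin is unchanged, skipped
```

This collapses an arbitrary number of intermediate versions into one download per changed asset, which is the whole point of the complaint above.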


I'd personally like to see a company put a little thought into innovating how they store data on disk, so patches can be applied quickly like with git, while also not requiring a full source recompilation.


It can get worse - some cheap and badly designed Android phones download one update for every month between when you first buy them and the current month, so maybe 10+ updates, but they aren't deltas (diffs), they're full images. Ridiculous on so many levels.


It’s because they only tested updates from one version to the next, and not every version to every newer version.

It is a complete image, but phones today have nontrivial state that may be a problem - e.g. your baseband processor might have its own rom with its own update protocol, which changed between image 2 and image 7, so image 10 after image 1 will be unable to update the baseband.


If it's a cheap phone, I'd rather they do something brute force but reliable than try to be clever when they know they don't have the budget to QA it.

I honestly consider that a pretty reasonable trade-off.



