Yeah I hear you. I've still never found a good way to implement schema validation rules like "this list cannot have more than 10 items" on top of a CRDT. Eventually consistent collaborative editing systems (like CRDTs and OT-based systems) are optimistic: they allow any edits, and then merge things later if necessary. But if you only find out the list has 11 elements at the merge step, what do you do? It's too late to reject either of the inserts that put you in that state.
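To make that concrete, here's a toy sketch (deliberately not a real CRDT library — merge is just a union of inserts keyed by unique IDs). Both replicas pass a local length check and the merged state still blows the invariant:

```typescript
type Item = { id: string; value: string };

// Toy merge: union of insertions, deduplicated by unique ID.
function merge(a: Item[], b: Item[]): Item[] {
  const seen = new Map<string, Item>();
  for (const item of [...a, ...b]) seen.set(item.id, item);
  return [...seen.values()];
}

// Both replicas start from the same 9-item list...
const base: Item[] = Array.from({ length: 9 }, (_, i) => ({
  id: `base-${i}`,
  value: `item ${i}`,
}));

// ...and each concurrently inserts one item. Locally, each replica
// sees 10 items, so a "length <= 10" check passes on both sides.
const replicaA = [...base, { id: 'a-1', value: 'tenth (from A)' }];
const replicaB = [...base, { id: 'b-1', value: 'tenth (from B)' }];

const merged = merge(replicaA, replicaB);
console.log(merged.length); // 11 — the invariant only breaks at merge time
```

By the time either replica can see the violation, both inserts have already been accepted locally, which is exactly the bind.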
My best answer at the moment is to tell people to rethink validation rules like that. I can't think of a lot of good use cases for collaborative editing where a "length <= 10" rule is something you want.
Unfortunately, validation rules are really important for referential integrity. If you add a reference to some item in a data set and I concurrently delete that item, what should happen? Does the delete get undone? Does the reference get removed? Is the reference just invalid now? Should references only point to an item at some point in time (e.g. a git SHA rather than a path)? Maybe optimistic systems just can't have referential integrity? It's an uncomfortable problem.
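Here's the same kind of toy sketch for that reference race — again not a real CRDT, just a union-style merge, with one possible policy (resolve at read time, i.e. the "reference is just invalid now" option) bolted on:

```typescript
type Doc = { items: Map<string, string>; refs: Set<string> };

function clone(d: Doc): Doc {
  return { items: new Map(d.items), refs: new Set(d.refs) };
}

const base: Doc = {
  items: new Map([['item-1', 'some content']]),
  refs: new Set<string>(),
};

// Replica A adds a reference to item-1...
const a = clone(base);
a.refs.add('item-1');

// ...while replica B concurrently deletes item-1.
const b = clone(base);
b.items.delete('item-1');

// A merge that honors both intents: deletes win on items, adds win on refs.
const merged: Doc = {
  items: new Map([...a.items].filter(([id]) => b.items.has(id))),
  refs: new Set([...a.refs, ...b.refs]),
};

// merged.refs contains 'item-1' but merged.items does not: a dangling
// reference. One policy is to resolve at read time and treat it as invalid:
function deref(doc: Doc, id: string): string | undefined {
  return doc.items.get(id);
}
console.log(deref(merged, 'item-1')); // undefined — the reference dangles
```

Every one of the other options (undo the delete, garbage-collect the reference, pin references to a point in time) is just a different merge or read policy over the same race; none of them is obviously right, which is what makes it uncomfortable.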