[TransWarp] Call for requirements: Validation and Constraints

Phillip J. Eby pje at telecommunity.com
Wed Jan 1 22:21:49 EST 2003


I'm trying to design a framework for validation of structural features of 
classes based on 'peak.model' metaclasses.  The idea is that it should 
allow checking of constraints upon data values.

For simplistic constraints like "attribute X must be between 1 and 50", 
this is trivial to implement.  Features could have a simple 'validate()' 
method, overridden in a model's definition to do the checking.  Even 
slightly more complex constraints like "attribute X must be less than 
attribute Y" can be handled this way, by validating assignments to both X 
and Y.
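
As a minimal sketch of how the immediate case might look (the 'Feature' 
descriptor and all the names here are purely illustrative, not actual 
peak.model API):

    class ValidationError(Exception):
        pass

    class Feature(object):
        """Descriptor that runs validate() on every assignment."""
        def __init__(self, name):
            self.name = name
        def __set__(self, obj, value):
            self.validate(obj, value)         # check before storing
            obj.__dict__[self.name] = value
        def __get__(self, obj, typ=None):
            if obj is None:
                return self
            return obj.__dict__[self.name]
        def validate(self, obj, value):
            """Override in a model's definition to do the checking."""
            pass

    class XFeature(Feature):
        def validate(self, obj, value):
            if not 1 <= value <= 50:          # "X between 1 and 50"
                raise ValidationError("X must be between 1 and 50")
            y = obj.__dict__.get('y')         # "X less than Y" -- Y's
            if y is not None and value >= y:  # feature needs the mirror
                raise ValidationError("X must be less than Y")

    class Widget(object):
        x = XFeature('x')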

However, constraints may also need to be "deferred".  That is, if 
attribute X must be less than attribute Y, it may be impossible to maintain 
this invariant while changing both X and Y.  So a "deferred" constraint 
should be checked, for example, when saving an object to persistent 
storage, or upon completing construction of an immutable object.  Deferred 
constraints are a pain from a debugging point of view, however, because 
they decouple the error from the point in the code where the problem was 
created.
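
To make the deferral concrete, here's a hypothetical sketch; the 
'_validate_deferred' name anticipates the hook proposed further down, and 
the save() checkpoint is invented for illustration:

    class ValidationError(Exception):
        pass

    class Box(object):
        def __init__(self, x, y):
            self.x = x
            self.y = y
        def _validate_deferred(self):
            # invariant that may be broken between assignments
            if self.x >= self.y:
                raise ValidationError("X must be less than Y")

    def save(obj, storage):
        obj._validate_deferred()   # the error surfaces here, far from
        storage.append(obj)        # the assignment that caused it

    b = Box(1, 10)
    b.x = 50       # invariant temporarily broken...
    b.y = 100      # ...and restored before the checkpoint
    save(b, [])    # passes; an immediate check would reject 'b.x = 50'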

The above notions also don't address features that are collections or 
associations.  Any constraint on an association that depends on data at the 
other end implies that changes to the features of the "other end" objects 
must be checked as well.  That is, if a constraint says that all members of 
the "childWindows" association must have bounding boxes contained within 
the bounding box of the parent, then any change to a window's bounding box 
must be verified against the inverse "parentWindow" relationship.  
(Assuming, of course, that there is such an inverse relationship, and that 
these window objects aren't immutable.)
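
A sketch of that inverse-relationship check, using invented 'Window' and 
'bounds' names, where boxes are (left, top, right, bottom) tuples:

    class ValidationError(Exception):
        pass

    def contains(outer, inner):
        return (outer[0] <= inner[0] and outer[1] <= inner[1]
                and inner[2] <= outer[2] and inner[3] <= outer[3])

    class Window(object):
        def __init__(self, bounds, parentWindow=None):
            self.parentWindow = parentWindow
            self.bounds = bounds
        def __setattr__(self, name, value):
            if name == 'bounds':
                # the constraint is declared on "childWindows", but a
                # change at this end is checked via the inverse link
                parent = self.__dict__.get('parentWindow')
                if parent is not None and not contains(parent.bounds, value):
                    raise ValidationError("child must fit inside its parent")
            self.__dict__[name] = value

    parent = Window((0, 0, 100, 100))
    child = Window((10, 10, 20, 20), parentWindow=parent)
    child.bounds = (10, 10, 50, 50)     # fine
    # child.bounds = (0, 0, 200, 50)    # would raise ValidationError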

There's even another potential phase of "deferred validation", wherein 
objects explicitly know about what's wrong with them, and simply have 
functionality limitations until the problem is resolved.  *But* the objects 
can still be saved and loaded from persistent storage.  An example would be 
a workflow design tool that validates the workflow definition in some 
fashion and won't let you run an invalid design, but will still let you 
save it so you can continue work on it later.
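
A sketch of this third phase, again with invented names: the object 
records its own problems, refuses to run, but saves without complaint:

    class InvalidDesignError(Exception):
        pass

    class WorkflowDesign(object):
        def __init__(self):
            self.steps = []
            self.problems = []    # known validation failures live here

        def validate(self):
            self.problems = []
            if not self.steps:
                self.problems.append("workflow has no steps")
            return not self.problems

        def run(self):
            # functionality is limited while problems remain...
            if not self.validate():
                raise InvalidDesignError("; ".join(self.problems))

        def save(self, storage):
            # ...but saving an invalid design is always allowed
            storage.append(self)

    design = WorkflowDesign()
    design.save([])    # fine, even though validate() currently fails
    # design.run()     # would raise InvalidDesignError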

One thing appears clear: implementing fully general constraint validation 
isn't simple.  What isn't clear, however, is how much *less* than fully 
general constraint validation we can get away with.  I would appreciate any 
input that anyone has to offer on this issue.

For the time being I'm assuming that hooks like the following are a minimum 
requirement for peak.model, with the caveat that the details might change 
by implementation time:

feature._validate_immediate(element,value) -- raise an exception if 'value' 
is not a valid target of assignment to 'feature'; otherwise return.

feature._validate_member_immediate(element,item) -- raise an exception if 
'item' is not acceptable as a target of the reference or collection 'feature'.

classifier._validate_deferred() -- raise an exception if the state of 
'classifier' is not valid.
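
To show the intended division of labor, here's a sketch of the three 
hooks in use.  Only the hook names come from the list above; the 
surrounding machinery is invented for illustration:

    class ValidationError(Exception):
        pass

    class PriceFeature(object):
        """A scalar feature: checks each assignment."""
        def _validate_immediate(self, element, value):
            if value < 0:
                raise ValidationError("price must be non-negative")

    class LinesFeature(object):
        """A collection feature: checks each added member."""
        def _validate_member_immediate(self, element, item):
            if not isinstance(item, str):
                raise ValidationError("line items must be strings")

    class Order(object):
        """A classifier: its whole state is checked on demand."""
        price = PriceFeature()
        lines = LinesFeature()

        def __init__(self):
            self._price = 0
            self._lines = []

        def setPrice(self, value):
            Order.price._validate_immediate(self, value)
            self._price = value

        def addLine(self, item):
            Order.lines._validate_member_immediate(self, item)
            self._lines.append(item)

        def _validate_deferred(self):
            # cross-feature invariant, checked e.g. just before saving
            if self._lines and self._price == 0:
                raise ValidationError("an order with lines needs a price")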

One thing I'm not clear on is whether these methods should really raise 
exceptions, or whether they should instead take some sort of visitor that 
accepts validation info.  While the visitor approach is more complex in 
some ways, the default visitor could simply raise a validation error based 
on the information it receives.  The advantage would be for scenarios where 
you want validation data, but you don't want the object to break, or to 
become unsaveable, while it's invalid.  It would also be handy when you'd 
like to collect multiple pieces of error information and consolidate them 
for display.
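
For instance, a sketch of the visitor variant, all names hypothetical: the 
default visitor raises on the first problem, while a collecting visitor 
accumulates everything for consolidated display:

    class ValidationError(Exception):
        pass

    class RaisingVisitor(object):
        """Default behavior: first problem becomes an exception."""
        def problem(self, element, message):
            raise ValidationError(message)

    class CollectingVisitor(object):
        """Gathers every problem, e.g. for consolidated display."""
        def __init__(self):
            self.problems = []
        def problem(self, element, message):
            self.problems.append((element, message))

    class Rectangle(object):
        def __init__(self, width, height):
            self.width = width
            self.height = height
        def _validate_deferred(self, visitor):
            # report every failure; the visitor decides what happens
            if self.width <= 0:
                visitor.problem(self, "width must be positive")
            if self.height <= 0:
                visitor.problem(self, "height must be positive")

    r = Rectangle(-1, 0)
    v = CollectingVisitor()
    r._validate_deferred(v)    # v.problems now holds both failures
    # r._validate_deferred(RaisingVisitor())  # raises on the first one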



