[TransWarp] Call for requirements: Validation and Constraints

Roché Compaan roche at upfrontsystems.co.za
Thu Jan 2 02:07:15 EST 2003

On Wed, 01 Jan 2003 22:21:49 -0500
"Phillip J. Eby" <pje at telecommunity.com> wrote:

> I'm trying to design a framework for validation of structural features of 
> classes based on 'peak.model' metaclasses.  The idea is that it should 
> allow checking of constraints upon data values.
>
> For simplistic constraints like "attribute X must be between 1 and 50", 
> this is trivial to implement.  Features could have a simple 'validate()' 
> method that could be overridden in a model's definition to do the 
> checking.  Even slightly more complex constraints like "attribute X must be 
> less than attribute Y" can be checked in this way, by checking assignment 
> to both X and Y.
>
> However, constraints may also need to be "deferred".  That is, if 
> attribute X must be less than attribute Y, it may be impossible to maintain 
> this invariant while changing both X and Y.  So a "deferred" constraint 
> should be checked, for example, when saving an object to persistent 
> storage, or upon completing construction of an immutable object.  Deferred 
> constraints are a pain from a debugging point of view, however, because 
> they decouple the error from the point in the code where the problem was 
> created.
>
> The above notions also don't address features that are collections or 
> associations.  Any constraint on an association that depends on data at the 
> other end, implies that changes to the features of the "other end" objects 
> must be checked as well.  That is, if a constraint says that all members of 
> the "childWindows" association must have bounding boxes which are contained 
> within the bounding box of the parent, then changing the bounding box of 
> any window must verify the change against the inverse "parentWindow" 
> relationship.  (Assuming of course that there is such an inverse 
> relationship, and that these window objects aren't immutable.)
>
> There's even another potential phase of "deferred validation", wherein 
> objects explicitly know about what's wrong with them, and simply have 
> functionality limitations until the problem is resolved.  *But* the objects 
> can still be saved and loaded from persistent storage.  An example would be 
> a workflow design tool which validated the workflow definition in some 
> fashion and wouldn't let you run an invalid design, but would still let you 
> save it so you could continue work on it later.
>
> One thing appears clear: implementing fully general constraint validation 
> isn't simple.  What isn't clear, however, is how much *less* than fully 
> general constraint validation we can get away with.  I would appreciate any 
> input that anyone has to offer on this issue.
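The immediate checks described above can be sketched roughly as follows. This is a minimal illustration using a Python descriptor; the names (`RangeFeature`, `validate`, `Widget`) are made up for the example and are not peak.model's actual API:

```python
class ValidationError(Exception):
    pass

class RangeFeature(object):
    """Illustrative feature enforcing 'attribute X must be between low and high'."""

    def __init__(self, name, low, high):
        self.name, self.low, self.high = name, low, high

    def validate(self, element, value):
        # Immediate check: runs on every assignment to the feature.
        if not (self.low <= value <= self.high):
            raise ValidationError("%s must be between %s and %s"
                                  % (self.name, self.low, self.high))

    def __set__(self, element, value):
        self.validate(element, value)        # fail at the assignment site
        element.__dict__[self.name] = value

    def __get__(self, element, owner=None):
        if element is None:
            return self
        return element.__dict__[self.name]

class Widget(object):
    x = RangeFeature('x', 1, 50)

w = Widget()
w.x = 10        # passes the immediate check
try:
    w.x = 99    # fails immediately, right where the bad value is assigned
except ValidationError as e:
    print(e)
```

Because the check runs inside `__set__`, the traceback points straight at the offending assignment, which is exactly the debugging advantage that deferred checks give up.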

The idea of "deferred validation" is new to me, but I feel one can get
away without it. It sounds very complicated to implement, and it could
easily lead to debugging hell.

If one doesn't have deferred validation, I think one can still achieve
the same goal by working on some sort of proxy or copy of the real
object until one needs to update the real object. This is much like
working on a copy of the source in a CVS checkout: as soon as one is
ready to check the code back in, one might receive warnings and have to
resolve them before the checkin is possible.
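The CVS-like working-copy idea could look something like this sketch. Every name here (`WorkingCopy`, `checkin`, the invariant list) is invented for illustration; the point is only that invariants may be violated freely in the draft and are checked once, at "checkin" time:

```python
import copy

class ValidationError(Exception):
    pass

class WorkingCopy(object):
    """Edit a throwaway copy of an object; validate only at checkin."""

    def __init__(self, original, invariants):
        self._original = original
        self._invariants = invariants            # list of (message, predicate)
        self._draft = copy.copy(original.__dict__)

    def set(self, name, value):
        self._draft[name] = value                # no checks while drafting

    def checkin(self):
        # Like 'cvs commit': all invariants must hold before changes land.
        problems = [msg for msg, ok in self._invariants
                    if not ok(self._draft)]
        if problems:
            raise ValidationError("; ".join(problems))
        self._original.__dict__.update(self._draft)

class Point(object):
    def __init__(self, x, y):
        self.x, self.y = x, y

p = Point(1, 5)
wc = WorkingCopy(p, [("x must be less than y",
                      lambda d: d['x'] < d['y'])])
wc.set('x', 8)    # temporarily violates x < y -- fine in the draft
wc.set('y', 10)   # invariant restored
wc.checkin()      # only now is the invariant checked, then changes land
```

If `checkin()` raises, the real object is untouched, so an invalid draft can never corrupt persistent state.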

I do however feel that constraints (not deferred) on collections are
needed. There are many use cases that show their importance:

    The contacts of customer X should have the "Member" role if the
    customer has purchased a license.

    Components X,Y and Z of Product A should always be enabled.

I think one could fill a book with examples like the above.
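The first use case above can be written as a plain collection-level check. This is a hypothetical sketch with dicts standing in for real customer and contact objects; the data layout and function name are assumptions:

```python
class ValidationError(Exception):
    pass

def check_member_roles(customer):
    """Collection constraint: if the customer has purchased a license,
    every one of its contacts must carry the 'Member' role."""
    if customer['has_license']:
        missing = [c['name'] for c in customer['contacts']
                   if 'Member' not in c['roles']]
        if missing:
            raise ValidationError(
                "contacts missing the Member role: " + ", ".join(missing))

customer = {
    'has_license': True,
    'contacts': [
        {'name': 'Alice', 'roles': ['Member', 'Admin']},
        {'name': 'Bob',   'roles': ['Guest']},
    ],
}

try:
    check_member_roles(customer)
except ValidationError as e:
    print(e)   # Bob lacks the Member role
```

Note that the constraint depends on data at the "other end" of the association (each contact's roles), so changing a contact's roles would also have to trigger this check, as the quoted mail points out.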

> For the time being I'm assuming that hooks like the following are a minimum 
> requirement for peak.model, with the caveat that the details might change 
> by implementation time:
>
> feature._validate_immediate(element,value) -- raise an exception if 'value' 
> is not a valid target of assignment to feature, else return.
>
> feature._validate_member_immediate(element,item) -- raise an exception if 
> 'item' is not acceptable as a target of the reference or collection 'feature'.
>
> classifier._validate_deferred() -- raise an exception if the state of 
> 'classifier' is not valid.
>
> One thing I'm not clear on is whether these methods should really raise 
> exceptions, or perhaps take some sort of visitor that accepts validation 
> info.  While this approach is more complex in some ways, the default 
> visitor could simply raise a validation error based on the 
> information.  The advantage would be for the scenarios where you want 
> validation data, but you don't want the object to break or be unable to 
> save it when it's invalid.  It would also be handy when you'd like to 
> possibly get multiple pieces of error information and consolidate them for 
> display.
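Taken literally, the three quoted hooks might be exercised as below, using the parent/child window example from earlier in the mail. For brevity this sketch folds all three hooks onto one stand-in class, whereas the quoted design puts the first two on the feature and the third on the classifier; none of this is real peak.model code:

```python
class ValidationError(Exception):
    pass

class Window(object):
    """Stand-in classifier carrying the three quoted hook names."""

    def __init__(self, width=0, height=0):
        self.width, self.height = width, height
        self.childWindows = []

    # feature._validate_immediate(element, value): checked on assignment
    def _validate_immediate(self, name, value):
        if name in ('width', 'height') and value < 0:
            raise ValidationError("%s must be non-negative" % name)

    # feature._validate_member_immediate(element, item): checked on add
    def _validate_member_immediate(self, name, item):
        if name == 'childWindows' and not isinstance(item, Window):
            raise ValidationError("childWindows accepts only Window objects")

    # classifier._validate_deferred(): checked e.g. before saving
    def _validate_deferred(self):
        for child in self.childWindows:
            if child.width > self.width or child.height > self.height:
                raise ValidationError("child window exceeds parent's bounds")

parent = Window(100, 100)
child = Window(150, 50)
parent._validate_member_immediate('childWindows', child)  # type check passes
parent.childWindows.append(child)

# The cross-object bounding-box invariant only surfaces at the deferred stage:
try:
    parent._validate_deferred()
except ValidationError as e:
    print(e)
```

This also shows why deferred errors are harder to debug: the exception fires at `_validate_deferred()`, far from the `append` that actually introduced the oversized child.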

It "feels" right that validation methods should raise an exception. One
can still catch the exceptions and consolidate the error information for
display. Raising makes it very explicit where validation failed, and the
developer is forced to be aware of it and handle it. With the visitor
pattern it seems possible that an exception is never raised, which could
prove fatal for object state.
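Catching and consolidating exception-based validation errors is straightforward, which suggests the visitor isn't strictly needed for the "collect everything for display" scenario. A minimal sketch, with invented names throughout:

```python
class ValidationError(Exception):
    pass

def validate_range(name, value, low, high):
    if not (low <= value <= high):
        raise ValidationError("%s must be between %s and %s"
                              % (name, low, high))

def collect_errors(record, checks):
    """Run every check, catching each exception so that all of the
    problems can be consolidated for display, not just the first."""
    errors = []
    for check in checks:
        try:
            check(record)
        except ValidationError as e:
            errors.append(str(e))
    return errors

record = {'x': 99, 'y': -1}
checks = [
    lambda r: validate_range('x', r['x'], 1, 50),
    lambda r: validate_range('y', r['y'], 0, 10),
]
print(collect_errors(record, checks))   # both problems, in one report
```

The default behaviour stays fail-loud: any caller that does not wrap the checks in `collect_errors` gets an exception it cannot silently ignore.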

Roché Compaan
Upfront Systems                 http://www.upfrontsystems.co.za
