[TransWarp] Constraints on model attributes

Phillip J. Eby pje at telecommunity.com
Mon Jul 28 09:57:40 EDT 2003


At 08:27 AM 7/28/03 +0200, Roché Compaan wrote:
>Except for ensuring that all model.Attributes have title, description
>and readonly attributes and provide a hook for constraints. It is clear
>to me now that these attributes can be put in a separate feature base
>and types should stay types.

Yep.


>Don't you want to give some examples of how syntax is used for
>conversion. What is happening here:
>
>         syntax = feature.syntax
>
>         if syntax is None:
>             syntax = getattr(feature.typeObject, 'mdl_syntax', None)
>
>         if syntax is None:
>             syntax = fmtparse.Conversion(
>                 converter = feature.fromString,
>                 formatter = feature.toString,
>                 defaultValue = feature._defaultValue,
>                 canBeEmpty = feature.canBeEmpty
>             )

If you grep the PEAK source for uses of 'URL', you'll find lots of places 
where syntaxes are defined and used.  The peak.util.fmtparse package 
defines rule-driven syntax objects like Sequence, Alternatives, 
MatchString, ExtractString, Tuple, Repeat, and so on.

The code above generates a default syntax for a feature in the case where 
you have not explicitly defined a syntax for it.  If the type has a 
'mdl_syntax', that's what's used, otherwise a simple Conversion syntax rule 
is created, as shown.

The syntax for a feature isn't intended to be used independently; rather, 
it's used to construct the syntax for a type.  If you look at how the 
various URL syntaxes are created, you'll notice that the overall URL syntax 
is usually created by stringing the features together as part of a syntax 
definition for the URL type as a whole.
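To make the "stringing features together" idea concrete, here is a standalone sketch of composing per-feature rules into a whole-type syntax. The `Conversion` and `Sequence` classes below are deliberately simplified stand-ins in the spirit of peak.util.fmtparse, not the real fmtparse API, and the "host:port" type is an invented example.

```python
# Simplified stand-ins for fmtparse-style rules -- NOT the real
# peak.util.fmtparse API, just an illustration of the composition idea.

class Conversion:
    """Parse/format a single field via a converter and formatter."""
    def __init__(self, converter=str, formatter=str):
        self.converter, self.formatter = converter, formatter
    def parse(self, text):
        return self.converter(text)
    def format(self, value):
        return self.formatter(value)

class Sequence:
    """Glue two per-feature rules together with a literal separator."""
    def __init__(self, rule1, separator, rule2):
        self.rule1, self.separator, self.rule2 = rule1, separator, rule2
    def parse(self, text):
        # Naive one-separator split, enough to show the structure
        left, right = text.split(self.separator, 1)
        return self.rule1.parse(left), self.rule2.parse(right)
    def format(self, values):
        return (self.rule1.format(values[0]) + self.separator +
                self.rule2.format(values[1]))

# A "host:port" syntax built by stringing two feature-level rules together:
host_syntax = Conversion()                        # strings pass through
port_syntax = Conversion(converter=int, formatter=str)
address_syntax = Sequence(host_syntax, ":", port_syntax)

print(address_syntax.parse("localhost:8080"))      # -> ('localhost', 8080)
print(address_syntax.format(("localhost", 8080)))  # -> localhost:8080
```

Each feature contributes only its own conversion rule; the type-level syntax is the composition, which is why a feature's syntax isn't very useful on its own.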


> > but I think I'd rather focus attention on *object*-level constraints,
> > and perhaps an object-level validation.
> >
> > Indeed, I wonder if maybe peak.model should offer an object-level
> > validation hook, that gets called before an object's state can be flushed
> > to storage.  If this hook were also callable by UI frameworks, they could
> > simply set all values as desired, call the validation hook to get a 
> list of
> > problems, and roll back the transaction.
>
>In addition to validation hooks on structural features?

If you're referring to the existing hooks, yes, I do intend to keep 
them.  But those are really *normalization* hooks, to my mind, because they 
are based on the type of the feature, rather than being specific to the 
feature.  I also see little reason to have validation hooks on individual 
features, because they would just add a lot of function call overhead for no 
good reason.  I'd rather see something like this at the object level:

     def mdl_validate(klass, obj, errors=None):
         if errors is None: errors = []
         if obj.foo > obj.bar:
             errors.append(ValidationProblem(obj, "Foo is larger than Bar"))
         # etc.
         super(ThisClass, klass).mdl_validate(obj, errors)
         return errors
     mdl_validate = classmethod(mdl_validate)

This is about as close to the "simplest thing that could possibly work" as 
such validation gets.  It gets right to the point, and also allows 
object-level validation to occur at a suitable checkpoint.  In my 
experience, most of
the meaningful non-type based constraints for an object have to do with 
relationships between different features, and can't be validated in the 
context of setting a single feature, unless all you're changing in that 
transaction is a single feature.
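Here is a hypothetical, self-contained sketch of how a UI framework or storage layer might use such a hook before flushing. The class names (`ModelElement`, `Order`) and `ValidationProblem` are illustrative, not actual PEAK API; the hook is written as a classmethod to match the `klass` parameter in the sketch above.

```python
# Illustrative sketch only -- ModelElement, Order, and ValidationProblem
# are invented names, not PEAK API.

class ValidationProblem:
    def __init__(self, obj, message):
        self.obj, self.message = obj, message
    def __repr__(self):
        return "ValidationProblem(%r)" % self.message

class ModelElement:
    # Base hook: subclasses cooperate via super() and share one error list
    def mdl_validate(klass, obj, errors=None):
        if errors is None: errors = []
        return errors
    mdl_validate = classmethod(mdl_validate)

class Order(ModelElement):
    def __init__(self, foo, bar):
        self.foo, self.bar = foo, bar
    def mdl_validate(klass, obj, errors=None):
        if errors is None: errors = []
        # A cross-feature constraint that can't be checked when setting
        # either feature alone:
        if obj.foo > obj.bar:
            errors.append(ValidationProblem(obj, "Foo is larger than Bar"))
        super(Order, klass).mdl_validate(obj, errors)
        return errors
    mdl_validate = classmethod(mdl_validate)

# A UI framework sets all the values it wants, then checks once:
order = Order(foo=10, bar=5)
problems = Order.mdl_validate(order)
if problems:
    pass  # roll back the transaction and show the problems to the user
```

The point of the single list is that the caller gets *all* the problems in one pass, instead of aborting on the first bad feature.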

And the *really* interesting constraints aren't even that simple, since 
they involve business rules that might be set by contextual matters, such 
as "what department is this purchase for" changing what rules are even 
applicable.  Ideally, I'd like the validation framework to be able to tap 
into those kinds of rules as well, but need to put some more thought into 
how to get at them.  Of course, it might be as simple as adding some code 
to mdl_validate that delegates to the department, but it'd be nice if one 
could just define rules externally and somehow have them apply.  Perhaps 
via some sort of lookup service on the data manager.  Anyway, it needs more 
thought.
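One way the "delegate to the department" idea might look, sketched with an external rule registry that mdl_validate consults. Everything here (the registry, `Purchase`, the sample rule) is an invented illustration of the lookup idea, not a proposed PEAK interface.

```python
# Hypothetical sketch: rules defined externally, keyed by context
# (here, the department), and consulted from a validation hook.
# None of these names are PEAK API.

RULES_BY_DEPARTMENT = {}

def add_rule(department, rule):
    """Register a rule that applies only in one department's context."""
    RULES_BY_DEPARTMENT.setdefault(department, []).append(rule)

def it_purchase_limit(purchase, errors):
    if purchase.amount > 500:
        errors.append("IT purchases over 500 need approval")

add_rule("IT", it_purchase_limit)

class Purchase:
    def __init__(self, department, amount):
        self.department, self.amount = department, amount
    def mdl_validate(self, errors=None):
        if errors is None: errors = []
        # Which rules apply depends on this purchase's context:
        for rule in RULES_BY_DEPARTMENT.get(self.department, []):
            rule(self, errors)
        return errors

print(Purchase("IT", 900).mdl_validate())
# -> ['IT purchases over 500 need approval']
print(Purchase("HR", 900).mdl_validate())
# -> []
```

In a real system the registry would presumably live behind a lookup service on the data manager rather than a module-level dict, but the shape is the same: the object doesn't hard-code its rules, it asks its context for them.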



