[PEAK] Moving forward with PyProtocols and generic functions

Phillip J. Eby pje at telecommunity.com
Wed Nov 3 18:47:53 EST 2004

About three weeks ago, I floated a proposal for a simplified "convenience" 
declaration API in PyProtocols:


So far, the only feedback has been from Ulrich, who expressed confusion 
about the 'performs' proposal, and opined that 'for_types' and 
'for_protocols' were less explicit than the current 'asAdapterForX' 
keywords.  So, here's a revised proposal:

     def foo(x,y,z):
         """Do whatever IFoo.__call__ says this should do"""

     @protocols.adapter_function(IFoo, adapts_types=[int])
     def int_as_foo(anInteger):
         # return an IFoo implementation for anInteger

     class Foo:
         protocols.implements(IFoo, adapts_protocols=[IBar])

I'm assuming that the other items I proposed, like module_provides and 
class_provides, are remaining the same as I proposed them.  Also, keep in 
mind that in Python 2.2 and 2.3, the '@' usages above will be spelled:

     def foo(x,y,z):
         """Do whatever IFoo.__call__ says this should do"""

     [protocols.adapter_function(IFoo, adapts_types=[int])]
     def int_as_foo(anInteger):
         # return an IFoo implementation for anInteger

Any comments on these proposals?

Also, after much reflection on where generic functions should go, I think 
I've settled on creating another package, but it will be called "dispatch" 
rather than "generics", and it will still be part of the PyProtocols 
distribution.  Like 'protocols', it will also be considered part of the 
PEAK core API, so you'll be able to do, e.g.:

     from peak.api import *

     def whatever(some):
         # blah

I will also move the currently-experimental 'as' decorator to the 
'dispatch' package, so that e.g.:

     [dispatch.as(classmethod)]
     def something(cls, etc):

can be used as a substitute for @classmethod (or 'something =
classmethod(something)' in current Python versions).

My current plan for the dispatch package is to include not only the current 
predicate-dispatch prototype there, but also a simple protocol-based 
single-dispatch generic function.  Basically, any place in PEAK where we 
have an interface with only one method is an excellent candidate for
replacement with a single-dispatch generic function.  For example, here's 
some recent code from peak.config:

# interfaces.py

class IStreamSource(Interface):
     """A way to load configuration from a file, URL, or other stream"""

     def getFactory(context):
         """Return a 'naming.IStreamFactory', using 'context' for any
         needed name lookups"""

# config_components.py

class StreamSource(protocols.Adapter):

     protocols.advise(
         instancesProvide = [IStreamSource],
         asAdapterForTypes = [str, unicode],
     )

     def getFactory(self, context):
         from peak.naming.factories.openable import FileFactory,FileURL
         try:
             url = FileURL.fromFilename(self.subject)
         except exceptions.InvalidName:
             url = naming.toName(self.subject, FileURL.fromFilename)
         if isinstance(url,FileURL):
             return FileFactory(filename=url.getFilename())
         return naming.lookup(context,url)

class FactorySource(protocols.Adapter):

     protocols.advise(
         instancesProvide = [IStreamSource],
         asAdapterForProtocols = [naming.IStreamFactory],
     )

     def getFactory(self, context):
         return self.subject

If the above were implemented using a single-dispatch generic function, it 
would look something like:

getStreamFactory = dispatch.SimpleGeneric(
     """Return a 'naming.IStreamFactory' for 'source'

     Usage::

         factory = config.getStreamFactory(source,context)

     Built-in cases::

         If 'source' is a 'naming.IStreamFactory', it is simply returned.
         If it is a string or Unicode object, it will be interpreted as
         either a filename or URL.  If it is a URL, it will be looked up
         in 'context'.

     You may define additional cases for this function using
     'dispatch.when', e.g.::

         from peak.config.api import getStreamFactory

         [dispatch.when(MyType)]
         def getStreamFactory(source,context):
             '''Return a stream factory for 'source' (a 'MyType' instance)'''
     """
)

[dispatch.when([str,unicode])]
def getStreamFactory(source,context):
     from peak.naming.factories.openable import FileFactory,FileURL
     try:
         url = FileURL.fromFilename(source)
     except exceptions.InvalidName:
         url = naming.toName(source, FileURL.fromFilename)
     if isinstance(url,FileURL):
         return FileFactory(filename=url.getFilename())
     return naming.lookup(context,url)

[dispatch.when(naming.IStreamFactory)]
def getStreamFactory(source, context):
     return source

This is not only more succinct (code-wise), but it also doesn't require 
creating an adapter instance each time the function is invoked.  (It will 
create a temporary tuple down in the implementation somewhere, but that's 
of little consequence.)
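(As an aside for readers in later Python versions: the standard library's
'functools.singledispatch' implements much the same single-dispatch pattern.
Here's a rough modern analogue of the sketch above, with a stand-in
'StreamFactory' class since 'peak.naming' isn't assumed available; this is
an illustration of the dispatch pattern, not the PyProtocols API:

```python
from functools import singledispatch

class StreamFactory:
    """Stand-in for a 'naming.IStreamFactory' implementation."""
    def __init__(self, filename):
        self.filename = filename

@singledispatch
def getStreamFactory(source, context=None):
    """Return a StreamFactory for 'source' (default: unsupported type)."""
    raise TypeError("No stream factory for %r" % (source,))

@getStreamFactory.register(str)
def _(source, context=None):
    # Interpret a string as a filename (the URL lookup case is omitted here)
    return StreamFactory(source)

@getStreamFactory.register(StreamFactory)
def _(source, context=None):
    # A factory is simply returned, as in the "Built-in cases" above
    return source
```

Note that 'singledispatch' selects an implementation purely on the class of
the first argument and knows nothing of 'adapt()' or protocols, so it's only
the skeleton of what 'dispatch.SimpleGeneric' is meant to do.)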

The above is not a finalized dispatch API, though.  For example, I'm not 
sure that you'll be able to use a list or a protocol directly as arguments 
to 'when'.  I'm not even positive you'll be able to use 'when' with 
single-dispatch functions, or whether there will be separate declarations 
for single and multiple-dispatch generics.

Anyway, as you can see, this approach would eliminate the need for many 
"one-method" interfaces in PEAK, while making it more obvious how to extend 
many of PEAK's APIs.  Documentation is also more straightforward, because 
we can describe the "built-in" behaviors in the docstring for the generic 
function itself.  (I've never felt comfortable with such "doc bundling" for 

In addition to these things, I'll also be working to finish out the 
predicate-dispatch generic functions.  One currently annoying issue is that 
predicate dispatch functions don't respect 'adapt()'-ability fully.  That 
is, they 1) do not use __conform__ and 2) the selected implementation is 
passed the original object, not an adapted object.  In effect, 'when("x in 
IFoo")' simply means that 'x' will be an instance of a class that can be 
*adapted* to IFoo, not that it is an object whose __conform__() returns a 
value for IFoo, or that the 'x' received by the method actually implements 
IFoo.  In effect, you need to always do:

      [when("x in IFoo")]
      def something(x,y):
          x = IFoo(x)

before using the 'x' in question.  However, this is *not* the case for 
single-dispatch generic functions, which can be implemented directly via 
'adapt()' (since there's only one parameter to dispatch on).

I'm not sure what to do about this, though, because 1) '__conform__' calls 
can't really be indexed in any meaningful way, and 2) I don't see a 
straightforward way to allow a multi-dispatch function's criteria to munge 
the function's parameters.  That might have to be something that's done by 
some sort of wrapper to the individual method, but that adds another 
calling layer versus just adding the explicit 'x = IFoo(x)' to the top of 
the method.  The only time it would be more efficient to do parameter 
munging is if there are other very expensive calculations that are part of 
the predicate, whose values are then needed in the method body.

I did, however, previously have an idea for such a syntax:

     [when("let(x=IFoo(x,None)) in (x is not None)")]

I was originally thinking that this would mainly be to make it easier to 
refer to a complex expression more than once, but in principle it could 
also be used to actually change or add arguments to specialized cases.

Of course, once such a mechanism existed, it could perhaps be exploited to 
let criteria do the munging as well, such that:

      [when("x in IFoo")]

is effectively shorthand for:

     [when("let(x=IFoo(x,None)) in (x is not None)")]

This still doesn't address '__conform__', but it's progress.

Finally, there's one other feature I plan to add, and that's the ability to 
use generic functions as methods in a class.  After much thinking recently 
about schema mappings, view registration, and other matters, I've concluded 
that the "best" way to do these things is by defining generic functions in 
a class, and then subclassing it to form alternate contexts.  For example, 
I could create a class that implements a relational mapping workspace over 
a generic schema, then subclass it to create a specific relational mapping 
that overrides some defaults.

The way that this would work is that since methods take a 'self' argument, 
we can actually use the *same* generic function for *all subclasses* of the 
original class.  However, if we tweak generic functions so that 
'SomeClass.someGeneric' returns a "bound" generic function that 
automatically adds "and self in SomeClass" to the criteria, then cases 
defined for subclasses override those defined in superclasses (as long as 
the remaining criteria are as specific or more specific than the criteria 
being overridden).
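Here's a minimal sketch of that mechanism in later-Python terms, outside
PyProtocols entirely (the 'generic_method' descriptor, its 'when' signature,
and the mro-length specificity rule are all inventions for illustration): a
descriptor records (owner class, predicate, implementation) cases, access
through an instance implicitly adds the "self in Owner" test, and the most
derived owner wins:

```python
class generic_method:
    """Toy generic-function-as-method with per-class case registration."""

    def __init__(self):
        self.cases = []          # list of (owner_class, predicate, func)

    def when(self, owner, predicate):
        def register(func):
            self.cases.append((owner, predicate, func))
            return func
        return register

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self          # class access exposes .when()
        def bound(*args, **kw):
            # "and self in Owner" is added to every case automatically
            applicable = [(owner, func) for owner, pred, func in self.cases
                          if isinstance(obj, owner) and pred(*args, **kw)]
            if not applicable:
                raise TypeError("No applicable method")
            # crude specificity rule: the most-derived owner class wins,
            # so subclass cases override superclass cases
            owner, func = max(applicable, key=lambda c: len(c[0].__mro__))
            return func(obj, *args, **kw)
        return bound

class Workspace:
    render = generic_method()

@Workspace.render.when(Workspace, lambda x: isinstance(x, int))
def _(self, x):
    return "generic int: %d" % x

class SQLWorkspace(Workspace):
    pass

@SQLWorkspace.render.when(SQLWorkspace, lambda x: isinstance(x, int))
def _(self, x):
    return "SQL int: %d" % x
```

A 'Workspace' instance gets the generic case, while an 'SQLWorkspace'
instance gets the override, even though both classes share the *same*
generic function object.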

Of course, for many of these uses, you'll never directly add cases to the 
generic functions; you'll likely use domain-specific higher-order functions 
like 'registerView()' or 'mapTable()' that will add closures to the generic
functions.
Anyway, the 'when()' function will still probably get some additional 
tweaking to make it relatively easy to add these "override cases" to a 
generic method in a subclass, when you're not using domain-specific APIs to 
define them.

This ability to bundle generic functions into classes and to subclass with 
selective overriding should make it possible to create some really 
interesting frameworks, especially those that are metadata driven.  For 
example, I expect to eventually replace virtually all of the current 
peak.security implementation this way, because all the funky adapters and 
temporary protocols should be replaceable with generic functions built into
the Interaction, with rules that are easy to extend or override in subclasses.

All of this generic function stuff is pretty much a requirement for future 
work on peak.web, which really "wants" views and menus (and possibly 
certain other things) to be generic function-based.  It's also a 
prerequisite for implementing the "new metadata" concept I described in:


Even though that post spoke in terms of adapting to protocols, the API and 
implementations will be cleaner if classes with generic functions are 
used.  This is because "interface inheritance" is in the "opposite 
direction" from functional inheritance.  Currently, to inherit the 
functionality of already-defined adapters, one creates a 
protocol.Variation() of the existing protocol.  However, to do the same 
thing with generic functions, one can simply subclass the class that 
contains the generic functions.  So, if one has a 'Syntax' class that holds 
default syntax metadata for arbitrary application schemas, one can easily 
subclass it to create e.g. 'XMLSyntax', and add override cases to its 
metadata functions.

This is a more straightforward approach than having to deal in some sort of 
"registry" or "context" instances and deriving them from each other 
somehow.  First, it's obvious what the objects do, because they're objects 
you can use to do something.  For example, one might say 's = 
Syntax(SomeClass)' to create a Syntax instance that knows how to parse and 
format instances of SomeClass.  Second, it's obvious how to extend them, 
because you just subclass them.

Anyway, that's the current PyProtocols state of the union, and plans for 
the future.  Comments, anyone?
