#format rst
Class, Function, and Assignment Decorators, Metaclasses, and Related Tools
==========================================================================

Want to use decorators, but still need to support Python 2.3? Wish you could
have class decorators, decorate arbitrary assignments, or match decorated
function signatures to their original functions? Want to get metaclass
features without creating metaclasses? How about synchronized methods?

"DecoratorTools" gets you all of this and more. Some quick examples::
 
    # Method decorator example
    from peak.util.decorators import decorate

can mix and match decorators from this package with those provided by
zope.interface, TurboGears, etc.
 
For complete documentation, see the `DecoratorTools manual`_.
 
Changes since version 1.7:
 
  * The ``@template_function`` decorator now supports using a return value
    instead of a docstring, in order to work with the "-OO" option to Python;
    it's highly recommended that you update your template functions to use
    a return value instead of a docstring. (The error message has also been
    improved for the missing docstring case.)
 
  * Fixed metaclass collisions in ``classy`` subclasses that mix in abstract
    classes (e.g. ``collections.Sequence``) in Python 2.6+.
 
Changes since version 1.6:
 
  * Added ``synchronized`` decorator to support locking objects during method
    execution.
 
Changes since version 1.5:
 
  * Added ``classy`` base class that allows you to do the most often-needed
    metaclass behaviors *without* needing an actual metaclass.
 
Changes since version 1.4:
 
  * Added ``enclosing_frame()`` function, so that complex decorators that call
    DecoratorTools functions while being called *by* DecoratorTools functions
    will work correctly.
 
Changes since version 1.3:
 
  * Added support for debugging generated code, including the code generated
    by ``rewrap()`` and ``template_function``.
 
Changes since version 1.2:
 
  * Added ``rewrap()`` function and ``template_function`` decorator to support
    signature matching for decorated functions. (These features are similar to
    the ones provided by Michele Simionato's "decorator" package, but do not
    require Python 2.4 and don't change the standard idioms for creating
    decorator functions.)
 
  * ``decorate_class()`` will no longer apply duplicate class decorator
    callbacks unless the ``allow_duplicates`` argument is true.
 
Changes since version 1.1:
 
  * Fixed a problem where instances of different struct types could compare
    equal to each other.
 
Changes since version 1.0:
 
  * The ``struct()`` decorator makes it easy to create tuple-like data
    structure types, by decorating a constructor function.
 
.. _DecoratorTools Manual: http://peak.telecommunity.com/DevCenter/DecoratorTools#toc
 
.. _toc:
 
.. contents:: **Table of Contents**
 

    of the original class, so the decorator should return the input class if it
    does not wish to replace it. Example::
 
        >>> from peak.util.decorators import decorate_class

        >>> def demo_class_decorator():
        ...     def decorator(cls):
        ...         print "decorating", cls
        ...         return cls
        ...     decorate_class(decorator)

        >>> class Demo:
        ...     demo_class_decorator()
        decorating __builtin__.Demo
 
    In the above example, ``demo_class_decorator()`` is the decorator factory
    function, and its inner function ``decorator`` is what gets called to

    that ``decorate_class()`` should call ``sys._getframe()`` with, but this
    can be a bit trickier to compute correctly.
 
    Note, by the way, that ``decorate_class()`` ignores duplicate callbacks::
 
        >>> def hello(cls):
        ...     print "decorating", cls
        ...     return cls

        >>> def do_hello():
        ...     decorate_class(hello)

        >>> class Demo:
        ...     do_hello()
        ...     do_hello()
        decorating __builtin__.Demo
 
    Unless the ``allow_duplicates`` argument is set to a true value::
 
        >>> def do_hello():
        ...     decorate_class(hello, allow_duplicates=True)

        >>> class Demo:
        ...     do_hello()
        ...     do_hello()
        decorating __builtin__.Demo
        decorating __builtin__.Demo
 
    
The ``synchronized`` Decorator
------------------------------
 
When writing multithreaded programs, it's often useful to define certain
operations as being protected by a lock on an object. The ``synchronized``
decorator lets you do this by decorating object methods, e.g.::
 
    >>> from peak.util.decorators import synchronized
 
    >>> class TryingToBeThreadSafe(object):
    ...     synchronized()  # could be just ``@synchronized`` for 2.4+
    ...     def method1(self, arg):
    ...         print "in method 1"
    ...         self.method2()
    ...         print "back in method 1"
    ...         return arg
    ...
    ...     synchronized()  # could be just ``@synchronized`` for 2.4+
    ...     def method2(self):
    ...         print "in method 2"
    ...         return 42
 
    >>> TryingToBeThreadSafe().method1(99)
    in method 1
    in method 2
    back in method 1
    99
 
What you can't tell from this example is that a ``__lock__`` attribute is being
acquired and released around each of those calls. Let's take a closer look::
 
    >>> class DemoLock:
    ...     def __init__(self, name):
    ...         self.name = name
    ...     def acquire(self):
    ...         print "acquiring", self.name
    ...     def release(self):
    ...         print "releasing", self.name
 
    >>> ts = TryingToBeThreadSafe()
    >>> ts.__lock__ = DemoLock("lock 1")
 
    >>> ts.method2()
    acquiring lock 1
    in method 2
    releasing lock 1
    42
 
    >>> ts.method1(27)
    acquiring lock 1
    in method 1
    acquiring lock 1
    in method 2
    releasing lock 1
    back in method 1
    releasing lock 1
    27
 
As you can see, if an object already has a ``__lock__`` attribute, its
``acquire()`` and ``release()`` methods are called around the execution of the
wrapped method. (Note that this means the lock must be re-entrant: that is,
you must use a ``threading.RLock`` or something similar to it, if you
explicitly create your own ``__lock__`` attribute.)
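
For example, you can pre-assign a re-entrant lock when the object is created
(a minimal sketch; the ``Counter`` class is just an illustrative name, not
part of DecoratorTools)::

    import threading
    from peak.util.decorators import synchronized

    class Counter(object):
        def __init__(self):
            self.__lock__ = threading.RLock()   # must be re-entrant
            self.count = 0

        synchronized()  # could be just ``@synchronized`` for 2.4+
        def increment(self):
            self.count += 1
            return self.count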
 
If the object has no ``__lock__``, the decorator creates a ``threading.RLock``
and tries to add it to the object's ``__dict__``::
 
    >>> del ts.__lock__
 
    >>> ts.method1(27)
    in method 1
    in method 2
    back in method 1
    27
 
    >>> ts.__lock__
    <_RLock(None, 0)>
 
(This means, by the way, that if you want to use synchronized methods on an
object with no ``__dict__``, you must explicitly include a ``__lock__`` slot
and initialize it yourself when the object is created.)
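
For example, a class using ``__slots__`` might be written like this (an
illustrative sketch, not taken from the manual's doctests)::

    import threading
    from peak.util.decorators import synchronized

    class Slotted(object):
        __slots__ = '__lock__', 'value'

        def __init__(self, value):
            self.__lock__ = threading.RLock()   # created explicitly, since
            self.value = value                  # there is no __dict__ to add it to

        synchronized()  # could be just ``@synchronized`` for 2.4+
        def get_value(self):
            return self.value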
 
 
The ``struct()`` Decorator
--------------------------

    >>> v.c
    3
 
    >>> help(X) # doctest: +NORMALIZE_WHITESPACE
    Help on class X:
    <BLANKLINE>
    class X(__builtin__.tuple)

     | __new__(cls, *args, **kw)
     |
     | ----------------------------------------------------------------------
     | ...s defined here:
     |
     | a...
     |
     | b...
     |
     | c...
     |
     | ----------------------------------------------------------------------
     | Data and other attributes defined here:
     |
     | __args__ = ['a', 'b', 'c']...
     |
     | __star__ = None
     |

    >>> p.rest
    (2, 3, 4)
 
Internally, ``struct`` types are actually tuples::
 
    >>> print tuple.__repr__(X(1,2,3))
    (<class 'X'>, 1, 2, 3)
 
The internal representation contains the struct's type object, so that structs
of different types will not compare equal to each other::
 
    >>> def Y(a,b,c):
    ...     return a,b,c
    >>> Y = struct()(Y)
 
    >>> X(1,2,3) == X(1,2,3)
    True
    >>> Y(1,2,3) == Y(1,2,3)
    True
    >>> X(1,2,3) == Y(1,2,3)
    False
 
Note, however, that this means that if you want to unpack them or otherwise
access members directly, you must include the type entry, or use a slice::
 
    >>> a, b, c = X(1,2,3) # wrong
    Traceback (most recent call last):
      ...
    ValueError: too many values to unpack
 
    >>> t, a, b, c = X(1,2,3) # right
    >>> a, b, c = X(1,2,3)[1:] # ok, if perhaps a bit unintuitive
 
The ``struct()`` decorator takes optional mixin classes (as positional
arguments), and dictionary entries (as keyword arguments). The mixin
classes will be placed before ``tuple`` in the resulting class' bases, and

    >>> def demo(a, b):
    ...     return a, b
 
    >>> demo = struct(Mixin, reversed=property(lambda self: self[:0:-1]))(demo)
    >>> demo(1,2).foo()
    bar
    >>> demo(3,4).reversed

    27
 
 
Signature Matching
------------------
 
One of the drawbacks to using function decorators is that using ``help()`` or
other documentation tools on a decorated function usually produces unhelpful
results::
 
    >>> def before_and_after(message):
    ...     def decorator(func):
    ...         def decorated(*args, **kw):
    ...             print "before", message
    ...             try:
    ...                 return func(*args, **kw)
    ...             finally:
    ...                 print "after", message
    ...         return decorated
    ...     return decorator

    >>> def foo(bar, baz):
    ...     """Here's some doc"""
 
    >>> foo(1,2)
    >>> help(foo) # doctest: -NORMALIZE_WHITESPACE
    Help on function foo:
    ...
    foo(bar, baz)
        Here's some doc
    ...
 
    >>> decorated_foo = before_and_after("hello")(foo)
    >>> decorated_foo(1,2)
    before hello
    after hello
 
    >>> help(decorated_foo) # doctest: -NORMALIZE_WHITESPACE
    Help on function decorated:
    ...
    decorated(*args, **kw)
    ...
 
So DecoratorTools provides you with two tools to improve this situation.
First, the ``rewrap()`` function provides a simple way to match the signature,
module, and other characteristics of the original function::
 
    >>> from peak.util.decorators import rewrap
 
    >>> def before_and_after(message):
    ...     def decorator(func):
    ...         def before_and_after(*args, **kw):
    ...             print "before", message
    ...             try:
    ...                 return func(*args, **kw)
    ...             finally:
    ...                 print "after", message
    ...         return rewrap(func, before_and_after)
    ...     return decorator
 
    >>> decorated_foo = before_and_after("hello")(foo)
    >>> decorated_foo(1,2)
    before hello
    after hello
 
    >>> help(decorated_foo) # doctest: -NORMALIZE_WHITESPACE
    Help on function foo:
    ...
    foo(bar, baz)
        Here's some doc
    ...
 
The ``rewrap()`` function returns you a new function object with the same
attributes (including ``__doc__``, ``__dict__``, ``__name__``, ``__module__``,
etc.) as the original function, but which calls the decorated function.
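
For example, the rewrapped function reports the original function's metadata
(an illustrative check, using the ``decorated_foo`` created above)::

    print decorated_foo.__name__     # -> foo
    print decorated_foo.__doc__      # -> Here's some doc
    print decorated_foo.__module__   # matches the module where foo was defined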
 
If you want the same signature but don't want the overhead of another calling
level at runtime, you can use the ``@template_function`` decorator instead.
The downside to this approach, however, is that it is more complex to use. So,
this approach is only recommended for more performance-intensive decorators
that you've already debugged using the ``rewrap()`` approach. But if you need
to use it, the appropriate usage looks something like this::
 
    >>> from peak.util.decorators import template_function
 
    >>> def before_and_after2(message):
    ...     def decorator(func):
    ...         [template_function()]  # could also be @template_function in 2.4
    ...         def before_and_after2(__func, __message):
    ...             return '''
    ...             print "before", __message
    ...             try:
    ...                 return __func($args)
    ...             finally:
    ...                 print "after", __message
    ...             '''
    ...         return before_and_after2(func, message)
    ...     return decorator
 
    >>> decorated_foo = before_and_after2("hello")(foo)
    >>> decorated_foo(1,2)
    before hello
    after hello
 
    >>> help(decorated_foo) # doctest: -NORMALIZE_WHITESPACE
    Help on function foo:
    ...
    foo(bar, baz)
        Here's some doc
    ...
 
As you can see, the process is somewhat more complex. Any values you wish
the generated function to be able to access (aside from builtins) must be
declared as arguments to the decorating function, and all arguments must be
named so as not to conflict with the names of any of the decorated function's
arguments.
 
The function template must return a static string that will be compiled into
a new function by DecoratorTools. The returned string must either fit on one
line, or begin with a newline and have its contents indented by at least two
spaces. The string ``$args`` may be used one or more times in the returned
string, whenever calling the original function. The first argument of the
decorating function must always be the original function.
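
A template that fits on a single line is also acceptable; for instance, a
trivial pass-through wrapper might look like this (a hypothetical sketch
following the rules above; ``passthru`` is not part of DecoratorTools)::

    from peak.util.decorators import template_function

    def passthru(func):
        [template_function()]  # could also be @template_function in 2.4
        def passthru_template(__func):
            # a one-line template: just call the original with its own arguments
            return '''return __func($args)'''
        return passthru_template(func)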
 
Note, however, that the function template is only called *once*, in order to get
this string, and it's called with dummy arguments. So the function must not
attempt to actually *use* any of its arguments, and must **always return a
static string**. Any attempt to insert the supplied arguments into the
template will result in an error::
 
    >>> def broken_decorator(func):
    ...     [template_function()]
    ...     def broken_template(__func, __message):
    ...         # This doesn't work; don't do this:
    ...         return '''
    ...         print "before %(__message)s"
    ...         try:
    ...             return __func($args)
    ...         finally:
    ...             print "after %(__message)s"
    ...         ''' % locals()
    ...     return broken_template(func, "test")
 
    >>> broken_decorator(foo)
    Traceback (most recent call last):
      ...
    RuntimeError: template functions must return a static string!
 
 
Debugging Generated Code
------------------------
 
Both ``rewrap()`` and ``template_function`` are implemented using code
generation and runtime compile/exec operations. Normally, such things are
frowned on in Python because Python's debugging tools don't work on generated
code. In particular, tracebacks and pdb don't show the source code of
functions compiled from strings... or do they? Let's see::
 
    >>> def raiser(x, y="blah"):
    ...     raise TypeError(y)

    >>> def call_and_print_error(func, *args, **kw):
    ...     # This function is necessary because we want to test the error
    ...     # output, but doctest ignores a lot of exception detail, and
    ...     # won't show the non-error output unless we do it this way
    ...     #
    ...     try:
    ...         func(*args, **kw)
    ...     except:
    ...         import sys, traceback
    ...         print ''.join(traceback.format_exception(*sys.exc_info()))
 
    >>> call_and_print_error(before_and_after("error")(raiser), 99)
    before error
    after error
    Traceback (most recent call last):
      File "<doctest README.txt[...]>", line ..., in call_and_print_error
        func(*args, **kw)
      File "<peak.util.decorators.rewrap wrapping raiser at 0x...>", line 3, in raiser
        def raiser(x, y): return __decorated(x, y)
      File ..., line ..., in before_and_after
        return func(*args, **kw)
      File "<doctest README.txt[...]>", line 2, in raiser
        raise TypeError(y)
    TypeError: blah
 
    >>> call_and_print_error(before_and_after2("error")(raiser), 99)
    before error
    after error
    Traceback (most recent call last):
      File "<doctest README.txt[...]>", line ..., in call_and_print_error
        func(*args, **kw)
      File "<before_and_after2 wrapping raiser at 0x...>", line 6, in raiser
        return __func(x, y)
      File "<doctest README.txt[...]>", line 2, in raiser
        raise TypeError(y)
    TypeError: blah
 
As you can see, both decorators' tracebacks include lines from the pseudo-files
"<peak.util.decorators.rewrap wrapping raiser at 0x...>" and "<before_and_after2
wrapping raiser at 0x...>" (the hex ids of the corresponding objects are
omitted here). This is because DecoratorTools adds information to the Python
``linecache`` module, and tracebacks and pdb both use the ``linecache`` module
to get source lines. Any tools that use ``linecache``, either directly or
indirectly, will therefore be able to display this information for generated
code.
 
If you'd like to be able to use this feature for your own code generation or
non-file-based code (e.g. Python source loaded from a database, etc.), you can
use the ``cache_source()`` function::
 
    >>> from peak.util.decorators import cache_source
    >>> from linecache import getline
 
    >>> demo_source = "line 1\nline 2\nline 3"
 
    >>> cache_source("<dummy filename 1>", demo_source)
    >>> getline("<dummy filename 1>", 3)
    'line 3'
 
The function requires a dummy filename, which must be globally unique. An easy
way to ensure uniqueness is to include the ``id()`` of an object that will
exist at least as long as the source code being cached.
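
For example (an illustrative naming scheme; DecoratorTools does not require
any particular format)::

    def unique_filename(owner, purpose="generated code"):
        # ``owner`` should live at least as long as the cached source;
        # its id() keeps the pseudo-filename globally unique for that lifetime
        return "<%s at 0x%x>" % (purpose, id(owner))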
 
Also, if you have such an object, and it is weak-referenceable, you can supply
it as a third argument to ``cache_source()``, and when that object is garbage
collected the source will be removed from the ``linecache`` cache. If you're
generating a function from the source, the function object itself is ideal for
this purpose (and it's what ``rewrap()`` and ``template_function`` do)::
 
    >>> def a_function(): pass # just an object to "own" the source
 
    >>> cache_source("<dummy filename 2>", demo_source, a_function)
    >>> getline("<dummy filename 2>", 1)
    'line 1\n'
 
    >>> del a_function # GC should now clean up the cache
 
    >>> getline("<dummy filename 2>", 1)
    ''
 
 
Advanced Decorators
-------------------
 

    when multiple decorators are used).
 
 
"Meta-less" Classes
-------------------
 
Sometimes, you want to create a base class in a library or program that will
use the data defined in subclasses in some way, or that needs to customize
the way instances are created (*without* overriding ``__new__``).
 
Since Python 2.2, the standard way to accomplish these things is by creating
a custom metaclass and overriding ``__new__``, ``__init__``, or ``__call__``.
 
Unfortunately, however, metaclasses don't play well with others. If two
frameworks define independent metaclasses, and a library or application mixes
classes from those frameworks, the user will have to create a *third* metaclass
to sort out the differences. For this reason, it's best to minimize the number
of distinct metaclasses in use.
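
The conflict shows up as a ``TypeError`` at class-creation time; for example,
with two unrelated (throwaway) metaclasses::

    class Meta1(type): pass
    class Meta2(type): pass

    class A(object): __metaclass__ = Meta1
    class B(object): __metaclass__ = Meta2

    class C(A, B):   # raises "TypeError: metaclass conflict ..."
        pass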
 
``peak.util.decorators`` therefore provides a kind of "one-size-fits-all"
metaclass, so that most of the common use cases for metaclasses can be handled
with just one metaclass. In PEAK and elsewhere, metaclasses are most commonly
used to perform some sort of operations during class creation (metaclass
``__new__`` and ``__init__``), or instance creation (metaclass ``__call__``,
wrapping the class-level ``__new__`` and ``__init__``).
 
Therefore, the ``classy`` base class allows subclasses to implement one or more
of the three classmethods ``__class_new__``, ``__class_init__``, and
``__class_call__``. The "one-size-fits-all" metaclass delegates these
operations to the class, so that you don't need a custom metaclass for every
class with these behaviors.
 
Thus, as long as all your custom metaclasses derive from ``classy.__class__``,
you can avoid any metaclass conflicts during multiple inheritance.
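
For instance, such a metaclass might be declared like this (a sketch;
``MyMeta`` and ``MyBase`` are illustrative names only)::

    from peak.util.decorators import classy

    class MyMeta(classy.__class__):
        """Extra metaclass behavior that remains compatible with ``classy``"""

    class MyBase(classy):
        __metaclass__ = MyMeta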
 
Here's an example of ``classy`` in use::
 
    >>> from peak.util.decorators import classy, decorate
 
    >>> class Demo(classy):
    ...     """Look, ma! No metaclass!"""
    ...
    ...     def __class_new__(meta, name, bases, cdict, supr):
    ...         cls = supr()(meta, name, bases, cdict, supr)
    ...         print "My metaclass is", meta
    ...         print "And I am", cls
    ...         return cls
    ...
    ...     def __class_init__(cls, name, bases, cdict, supr):
    ...         supr()(cls, name, bases, cdict, supr)
    ...         print "Initializing", cls
    ...
    ...     decorate(classmethod)  # could be just @classmethod for 2.4+
    ...     def __class_call__(cls, *args, **kw):
    ...         print "before creating instance"
    ...         ob = super(Demo, cls).__class_call__(*args, **kw)
    ...         print "after creating instance"
    ...         return ob
    ...
    ...     def __new__(cls, *args, **kw):
    ...         print "new called with", args, kw
    ...         return super(Demo, cls).__new__(cls)
    ...
    ...     def __init__(self, *args, **kw):
    ...         print "init called with", args, kw
    My metaclass is <class 'peak.util.decorators.classy_class'>
    And I am <class 'Demo'>
    Initializing <class 'Demo'>
 
    >>> d = Demo(1,2,a="b")
    before creating instance
    new called with (1, 2) {'a': 'b'}
    init called with (1, 2) {'a': 'b'}
    after creating instance
 
Note that because ``__class_new__`` and ``__class_init__`` are called *before*
the name ``Demo`` has been bound to the class under creation, ``super()``
cannot be used in these methods. So, they use a special calling convention,
where the last argument (``supr``) is the ``next()`` method of an iterator
that yields base class methods in mro order. In other words, calling
``supr()(..., supr)`` invokes the previous definition of the method. You MUST
call this exactly *once* in your methods -- no more, no less.
 
``__class_call__`` is different, because it is called after the class already
exists. Thus, it can be a normal ``classmethod`` and use ``super()`` in the
standard way.
 
Finally, note that any given ``classy`` subclass does NOT need to define all
three methods; you can mix and match methods as needed. Just be sure to always
use the ``supr`` argument (or ``super()`` in the case of ``__class_call__``).
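
For example, a base class that only needs to keep track of its subclasses can
define nothing but ``__class_init__`` (an illustrative sketch; ``Registered``
is not part of DecoratorTools)::

    from peak.util.decorators import classy

    class Registered(classy):
        """Each subclass of this base gets added to ``Registered.registry``"""

        registry = []

        def __class_init__(cls, name, bases, cdict, supr):
            supr()(cls, name, bases, cdict, supr)  # call the previous definition exactly once
            if 'registry' not in cdict:            # skip ``Registered`` itself
                cls.registry.append(cls)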
 
 
Utility/Introspection Functions
-------------------------------
 

    are no base classes, you should just directly use the module-level
    ``__metaclass__`` or ``types.ClassType`` if there is none.
 
enclosing_frame(frame=None, level=3)
    Given a frame and/or stack level, skip upward past any DecoratorTools code
    frames. This function is used by ``decorate_class()`` and
    ``decorate_assignment()`` to ensure that any decorators calling them that
    were themselves invoked using ``decorate()`` won't end up looking at
    DecoratorTools code instead of the target. If you have a function that
    needs to be callable via ``decorate()`` and which inspects stack frames,
    you may need to use this function to access the right frame.
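
    A hypothetical sketch of such a function (``target_namespace`` is just an
    illustrative name)::

        import sys
        from peak.util.decorators import enclosing_frame

        def target_namespace():
            # the frame that "really" invoked us, skipping any intervening
            # DecoratorTools frames (e.g. when we were called via ``decorate()``)
            frame = enclosing_frame(sys._getframe(1))
            return frame.f_locals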
 
 
Mailing List
------------
