# Partially applied functions and decorators in python

Thu 03 October 2013

This is not a quick tutorial. It will tell you all you need to know about decorators, but slowly builds up the necessary theoretical and technical background in terms of partially applied functions, closures and lexical scoping before eventually discussing decorators.

## Partial application of functions

As per the wikipedia article on Partial Application,

In computer science, partial application (or partial function application) refers to the process of fixing a number of arguments to a function, producing another function of smaller arity. Given a function f : (X × Y × Z) → N , we might fix (or 'bind') the first argument, producing a function of type partial(f) : (Y × Z) → N . Evaluation of this function might be represented as fpartial(2, 3). Note that the result of partial function application in this case is a function that takes two arguments.

Thus if you have a function f with arguments x1, x2, ..xn, a partial application technique would allow you to

• Create a partially applied function fpartial specifying only a subset of the arguments x1, x2, ..xn. Assuming them to be the first two, it would be fpartial = f(x1, x2)
• Use the partially applied function with the remainder of the arguments to get the same result as one would have got from the original function. Continuing with the above assumption that would allow you to do fpartial(x3, x4, ..., xn)

## Partial application in python

```
def add(a, b):
    return a + b

print("Adding {} and {} results in {}".format(2, 3, add(2, 3)))
```
```
Adding 2 and 3 results in 5
```

Let us look at two different ways of partial application in python

### Partial application using nested functions

```
def add(a):
    def add_a_to(b):
        return a + b
    return add_a_to

print("Adding 2 and 3 results in {}".format(add(2)(3)))
print("Calling add with 2 alone results in {} which is a {} with the name '{}'".format(
    add(2), type(add(2)), add(2).__name__))
```
```
Adding 2 and 3 results in 5
Calling add with 2 alone results in <function add_a_to at 0x33eb848> which is a <type 'function'> with the name 'add_a_to'
```

Note the following :

• In this implementation we changed the calling syntax from add(2,3) to add(2)(3)
• This is because we now make two function calls. The first call, add(2), returns a function called add_a_to. We then call add_a_to(3). However, since we simply chain the function calls together, it is succinctly written as add(2)(3)
• The implementation technique of nesting one function inside another is often referred to as nested functions. It requires structural modification of the function definition itself.

### Partial application using functools.partial

This technique does not require us to rewrite the function differently. It allows us to first pass any arbitrary subset of arguments (eg. the second argument, ie. 3 above), create a partially applied function using functools.partial, and then use that to subsequently invoke the function with the remainder of the arguments. For reference, see functools.partial in the python documentation.

```
from functools import partial

def add(a, b):
    return a + b

add_b_3 = partial(add, b=3)
print("add partially applied with 2nd argument as 3 results in {}".format(add_b_3))
print("Adding 2 to 3 using partial application results in {}".format(add_b_3(2)))
print("Partial applications using functools.partial can be inlined, " +
      "though unlikely to be used so.")
print("2 + 3, ie. partial(add,b=3)(2) = {}".format(partial(add, b=3)(2)))
```
```
add partially applied with 2nd argument as 3 results in <functools.partial object at 0x32c8470>
Adding 2 to 3 using partial application results in 5
Partial applications using functools.partial can be inlined, though unlikely to be used so.
2 + 3, ie. partial(add,b=3)(2) = 5
```

## Why is partial application useful?

Partial application helps us implement spatial and temporal separation.

• Spatial separation : Partial argument lists can be specified at different places in the code. This might be convenient from just a structuring perspective or perhaps from the perspective of lexical scoping.
• Temporal separation: Partial argument lists can be partially applied at different times in the execution of the code. This is convenient say when some arguments are to be included in the code itself, some are to be partially applied only at the program startup configuration time and some others might be required at runtime.
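As a small illustration of temporal separation, consider a hypothetical notify function (all names here are invented for this sketch): the deployment specific arguments get fixed once at startup, and only the per-call arguments are supplied at runtime.

```python
from functools import partial

def notify(smtp_host, sender, recipient, message):
    # In real code this would send an email; here we just build a string
    return "via {}: {} -> {}: {}".format(smtp_host, sender, recipient, message)

# At program startup / configuration time: fix the deployment specific arguments
notifier = partial(notify, "smtp.example.com", "alerts@example.com")

# Much later, at runtime, only the per-call arguments are needed
print(notifier("ops@example.com", "disk almost full"))
```

The rest of the program only needs to carry `notifier` around, not the host and sender.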

## A tour of closures and lexical scope

Typical usage of decorators leverages the capabilities afforded by closures and lexical scope, so let us take a quick side tour of these. As per the wikipedia article on Closures,

In programming languages, a closure (also lexical closure or function closure) is a function or reference to a function together with a referencing environment—a table storing a reference to each of the non-local variables (also called free variables or upvalues) of that function. A closure—unlike a plain function pointer—allows a function to access those non-local variables even when invoked outside its immediate lexical scope.

Also, on the wikipedia page for Scope, under Lexical vs. dynamic scoping, we can find the words

In lexical scoping (or lexical scope; also called static scoping or static scope), if a variable name's scope is a certain function, then its scope is the program text of the function definition: within that text, the variable name exists, and is bound to the variable's value, but outside that text, the variable name does not exist

Let us explore this in the context of python programs

```
var1 = 3
var2 = 4

def foo():
    var2 = 5
    print("Value of var2 is {}".format(var2))
    print("Global var1={}, var2={}".format(globals()["var1"], globals()["var2"]))
    print("Locals are {}".format(locals()))
    var1 = 9
    print("Value of var1 is {} and var2 is {}".format(var1, var2))
    print("Global var1={}, var2={}".format(globals()["var1"], globals()["var2"]))
    print("Locals are {}".format(locals()))

foo()
print("Outside the function, value of var1 is {} and var2 is {}".format(var1, var2))
```
```
Value of var2 is 5
Global var1=3, var2=4
Locals are {'var2': 5}
Value of var1 is 9 and var2 is 5
Global var1=3, var2=4
Locals are {'var1': 9, 'var2': 5}
Outside the function, value of var1 is 3 and var2 is 4
```

I would like to highlight the following

• Variables declared in the global scope are accessible to a function explicitly via the globals() function
• Variables declared within the function scope are accessible to a function either implicitly or explicitly via the locals() function
• A global variable redeclared within the function scope results in two variables, one in the global and another in the local scope. However the local binding shadows the global binding.
• When a function invocation concludes, its scope is over and the local bindings in that function cease to be relevant
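A corollary worth noting, sketched below: because Python decides local vs. global per function at compile time, a function that assigns to a name anywhere in its body shadows the global of that name, and the `global` statement is needed if you want to rebind the global instead. (Both function names here are invented for illustration.)

```python
var = 3

def shadowing():
    # var is local to this function because it is assigned in it;
    # the global var is untouched
    var = 10
    return var

def rebinding():
    global var  # explicitly bind the name to the global scope
    var = 10
    return var

print(shadowing())  # 10
print(var)          # still 3
print(rebinding())  # 10
print(var)          # now 10
```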

Now let us take a look at another program which uses nested functions and will require us to revisit the last point made above.

```
var = 5

def outer(arg):
    var = arg
    def inner():
        print("The value of var is {}".format(var))
    print("local value of var is {}. Now being incremented".format(var))
    var = arg + 1
    return inner

print("The value of the global var is {}".format(var))
nested1 = outer(6)
nested2 = outer(99)
print("The free variables carried by nested1 from outer scope are {}".format(
    list(zip(nested1.__code__.co_freevars,
             (c.cell_contents for c in nested1.__closure__)))))
print("The free variables carried by nested2 from outer scope are {}".format(
    list(zip(nested2.__code__.co_freevars,
             (c.cell_contents for c in nested2.__closure__)))))
nested1()
nested2()
print("The value of the global var still is {}".format(var))
```
```
The value of the global var is 5
local value of var is 6. Now being incremented
local value of var is 99. Now being incremented
The free variables carried by nested1 from outer scope are [('var', 7)]
The free variables carried by nested2 from outer scope are [('var', 100)]
The value of var is 7
The value of var is 100
The value of the global var still is 5
```

Now note the following :

• var is created in the global scope, its value is set to 5 and remains 5 throughout the program
• There is another var created within the scope of the outer function which shadows the global var within the lexical scope of outer.
• The local var is initialised to the value passed as a parameter to outer. That is the value visible to inner() at declaration time
• The local var is incremented after the inner function has been declared.
• In general the local var is no longer valid once the call to outer returns, since that's where its scope ends.
• However if a nested function is declared and returned from outer (in this case inner), then such a function continues to carry the variables available to it at declaration time. This is exactly what closures are, ie. the function inner closes over the scope of the function outer.
• The variable value in the closure is not a snapshot of the value at the point in time the function was declared. It will reflect any changes made after the inner function was declared but before the outer function returned
• The variable values are carried by the nested function via its `__closure__` attribute (and the variable names via `__code__.co_freevars`)

To summarise the necessary learnings:

• Nested functions help us implement partial application of functions
• Any variables declared in the lexical scope of the outer nested function are available to the inner nested functions even after the outer function invocation is over.
• A new local scope gets created whenever an outer nested function is called. Any returned inner functions, as a result of multiple outer function invocations, get independent copies of the local scope of each execution of the outer function.
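To see the independent copies concretely, here is a minimal counter-factory sketch (names invented for illustration); each call to make_counter produces a closure over its own fresh scope:

```python
def make_counter():
    count = [0]  # a mutable cell, so the inner function can update it
    def counter():
        count[0] += 1
        return count[0]
    return counter

c1 = make_counter()
c2 = make_counter()
print(c1(), c1(), c2())  # 1 2 1 -- each closure keeps its own count
```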

## How is partial application useful?

There are a number of uses of partial applications. Some of these are

### Creating specialised functions from general functions

Many a time, the functions we easily relate to are but specialised versions of more general functions. Writing more general functions is a better way to follow the Do not repeat yourself (DRY) principle. Specialising them can allow for easier readability, since they can be more intuitive to understand at the call site.

A simple generic function might be declared as follows

```
def exponent_of(base, exponent):
    return base ** exponent

print("3 to the power 4 is {}".format(
    exponent_of(3, 4)))
```
```
3 to the power 4 is 81
```

However a more specialised function might make for easier reading. While you could just as easily use functools.partial, the example below shows partial application using nested functions

```
def nth_power(exponent):
    def exponent_of(base):
        return base ** exponent
    return exponent_of

square = nth_power(2)
cube = nth_power(3)

print("The square of 3 is {}".format(square(3)))
print("The cube of 4 is {}".format(cube(4)))
```
```
The square of 3 is 9
The cube of 4 is 64
```
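For comparison, the same specialisation can be done with functools.partial; note that exponent must be passed as a keyword argument here, since it is the second positional parameter:

```python
from functools import partial

def exponent_of(base, exponent):
    return base ** exponent

# fix the exponent, leaving the base to be supplied later
square = partial(exponent_of, exponent=2)
cube = partial(exponent_of, exponent=3)

print("The square of 3 is {}".format(square(3)))  # 9
print("The cube of 4 is {}".format(cube(4)))      # 64
```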

### Temporal separation of function arguments

In some cases, a function may need many arguments, some of which are specified and available at different points in time than others. A usual way of dealing with this is to carry forward all the available arguments to the eventual call site. Sometimes those that are specified early on are rarely if ever modified later. It becomes inconvenient to carry around all these variables in the namespace. An alternative is to create a partially applied function and carry forward only the partially applied function to the eventual call site. eg.

```
# Conventional method
def do_query(userid, password, query):
    # do query using userid and password
    # return results
    pass

# .. lot of code here, with userid and password carried along .. eventually reaching
do_query(userid, password, "select ...")

# Alternative method
def make_query_agent(userid, password):
    def do_query(query):
        # do query using userid and password
        # return results
        pass
    return do_query

# at configuration time
query_agent = make_query_agent(userid, password)

def bar(querying_func):
    return querying_func("select ...")

def foo(querying_func):
    return bar(querying_func)

# .. much further down the line
foo(query_agent)
```

### Use closures instead of classes

It is not uncommon to see classes with only one significant public method. In many such scenarios, the class constructor is used to specify the arguments of the method, and the method itself is used to perform the desired behaviour. This gives us temporal separation (object construction and method execution) and encapsulation of the arguments. It so happens that the same can be done using closures as well. (Note: although the example below is quite similar to the one above, they serve to describe different intents)

```
# traditional method
class Connection(object):
    def __init__(self, userid, password):
        self.userid = userid
        self.password = password

    def execute(self, sql):
        # execute SQL using userid, password
        # return results
        pass

c = Connection("user", "secret")
c.execute("select 'x' from dual;")

# Using closures
def connection(userid, password):
    def execute(sql):
        # execute SQL using userid, password & sql
        # return results
        pass
    return execute

c = connection("user", "secret")
c("select 'x' from dual;")
```

### Aside: Currying vs. Partial Applications

As per the wikipedia article on Currying,

In mathematics and computer science, currying is the technique of transforming a function that takes multiple arguments (or a tuple of arguments) in such a way that it can be called as a chain of functions, each with a single argument (partial application).

It further goes on to define it as follows

Given a function f of type f : (X × Y) → Z , currying it makes a function curry(f) : X → (Y → Z) . That is, curry(f) takes an argument of type X and returns a function of type Y → Z .

With a further view to disambiguate the term from partial applications, it goes on to state that

Currying and partial function application are often conflated. One of the significant differences between the two is that a call to a partially applied function returns the result right away, not another function down the currying chain; this distinction can be illustrated clearly for functions whose arity is greater than two.

Given a function of type f : (X × Y × Z) → N , currying produces curry(f) : X → (Y → (Z → N)) . That is, while an evaluation of the first function might be represented as f(1, 2, 3), evaluation of the curried function would be represented as fcurried(1)(2)(3), applying each argument in turn to a single-argument function returned by the previous invocation. Note that after calling fcurried(1), we are left with a function that takes a single argument and returns another function, not a function that takes two arguments.

Currying thus is just a special case of partial application, where we partially apply the arguments one at a time, in the order in which they are declared. In general, currying (especially as defined above) is not a useful notion to attempt to use in python. However, you might find the term getting used in the context of python, and more often than not, it means partial application. What would currying actually look like? Well, for the curious, here's a function that can curry another

```
def add(a, b, c):
    return a + b + c

def curry(func):
    args = []
    if func.__code__.co_argcount == 0:
        return func()
    def wrap(arg):
        args.append(arg)
        if len(args) == func.__code__.co_argcount:
            return func(*args)
        else:
            return wrap
    return wrap

print(curry(add)(3)(4)(5))
```
```
12
```
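For comparison, chained functools.partial applications can mimic the same one-argument-at-a-time style, though each step has to be spelt out explicitly:

```python
from functools import partial

def add(a, b, c):
    return a + b + c

step1 = partial(add, 3)    # fixes a
step2 = partial(step1, 4)  # fixes b
print(step2(5))            # supplies c; prints 12
```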

## Decorators

Decorators were introduced as a syntactic addition to the python language in PEP-318. It is often assumed that the name decorators comes from the Gang of Four book on Design Patterns. The PEP in fact clarifies that it is not so (and acknowledges that the name has drawn its share of criticism and that perhaps a better name would come up eventually - it did not).

The name 'decorator' probably owes more to its use in the compiler area -- a syntax tree is walked and annotated

I've heard it said that decorators are just nested or partially applied functions. While that is accurate, it misses an important distinction. Although both use the same technique, partial application is usually used to split a task into two. Decorators, on the other hand, are used to wrap one or more add-on tasks around a core task. So there are many scenarios where you would use partially applied functions but would not want to use decorators.

```
def add(a, b):
    return a + b

print("Adding 2 and 3 results in {}".format(add(2, 3)))

# Note the following.
# we shall contrast this later when function wrapping kicks in
import inspect
print("The specification for add is name={} args={}".format(
    add.__name__, inspect.getargspec(add)))
```
```
Adding 2 and 3 results in 5
The specification for add is name=add args=ArgSpec(args=['a', 'b'], varargs=None, keywords=None, defaults=None)
```

We shall now decorate this function with a tracer. A function which can print entry and exit to the desired function

```
def trace(func):
    def wrapper(*args, **kwargs):
        print("Entering {} with arguments {} {}".format(
            func.__name__, args, kwargs))
        ret = func(*args, **kwargs)
        print("Leaving {} with result {}".format(func.__name__, ret))
        return ret
    return wrapper

@trace  # <--- This is how a decorator is applied
def add(a, b):
    return a + b

print("Adding 2 and 3 results in {}".format(add(2, 3)))

import inspect
print("The specification for add is name={} args={}".format(
    add.__name__, inspect.getargspec(add)))
```
```
Entering add with arguments (2, 3) {}
Leaving add with result 5
Adding 2 and 3 results in 5
The specification for add is name=wrapper args=ArgSpec(args=[], varargs='args', keywords='kwargs', defaults=None)
```

Note that applying the trace function as a decorator using "@trace" did not just create a new function. It did create a new wrapped function, but also replaced the function in the namespace for the name "add" with the new wrapped function. The original function add is no longer available to be directly accessed from the namespace.

Decorators are just the outer function of a nested function pair. The decorated function is essentially a closure which retains access to the variables registered by the decorator in its local namespace, and the rules by which such variables are lexically scoped yet accessible to closures continue to apply.
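You can see this closure directly: the original function survives in the wrapper's closure cells. A minimal sketch (restating a stripped-down trace decorator here so the snippet is self-contained):

```python
def trace(func):
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@trace
def add(a, b):
    return a + b

# wrapper closes over 'func'; the original add lives in the closure cell
print(add.__code__.co_freevars)                # ('func',)
print(add.__closure__[0].cell_contents(2, 3))  # 5, via the original add
```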

The function name is now changed if you were to inspect it. So `add.__name__` now returns wrapper and not add. That can be corrected by using the @wraps decorator from the functools module.

```
from functools import wraps

def trace(func):
    @wraps(func) # <-- Applying the wraps decorator helps in the wrapped function retaining the original name
    def wrapper(*args, **kwargs):
        print("Entering {} with arguments {} {}".format(
            func.__name__, args, kwargs))
        ret = func(*args, **kwargs)
        print("Leaving {} with result {}".format(func.__name__, ret))
        return ret
    return wrapper

@trace
def add(a, b):
    return a + b

print("Adding 2 and 3 results in {}".format(add(2, 3)))

import inspect
print("The specification for add is name={} args={}".format(
    add.__name__, inspect.getargspec(add)))
```
```
Entering add with arguments (2, 3) {}
Leaving add with result 5
Adding 2 and 3 results in 5
The specification for add is name=add args=ArgSpec(args=[], varargs='args', keywords='kwargs', defaults=None)
```

Here, thankfully, the name of the wrapped function is now add, should anyone care to inspect the function. The argument specification still remains different. And it should, if trace is to be a generic function which can trace all functions, not just those which have two parameters.

The takeaway is that because decorators replace functions in place, some information about the functions can get lost to function inspectors. The same would not happen if you chose to explicitly wrap a function without using the decorator syntax, as in the following. There are other effects as well, eg. you can no longer easily unit test the original undecorated function. If the decorator introduces side effects or, say, changes the return value, you have to deal with that in your unit tests.
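One mitigation worth knowing: from Python 3.2 onwards, functools.wraps records the original function on the wrapper as `__wrapped__`, so tests can reach past the decoration. A minimal sketch:

```python
from functools import wraps

def trace(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@trace
def add(a, b):
    return a + b

# functools.wraps stores the undecorated function as __wrapped__
original = add.__wrapped__
print(original(2, 3))  # 5, bypassing the wrapper entirely
```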

The code below once again demonstrates the scenario where trace is used as a wrapper, without actually using it as a decorator

```
def trace(func):
    def wrapper(*args, **kwargs):
        print("Entering {} with arguments {} {}".format(
            func.__name__, args, kwargs))
        ret = func(*args, **kwargs)
        print("Leaving {} with result {}".format(func.__name__, ret))
        return ret
    return wrapper

def add(a, b):
    return a + b

# Not traced
print("Adding 2 and 3 results in {}".format(add(2, 3)))

# Explicitly wrapped, without the decorator syntax
traced_add = trace(add)
print("Adding 4 and 5 results in {}".format(traced_add(4, 5)))

import inspect
print("The specification for add is name={} args={}".format(
    add.__name__, inspect.getargspec(add)))
print("The specification for trace is name={} args={}".format(
    trace.__name__, inspect.getargspec(trace)))
```
```
Adding 2 and 3 results in 5
Entering add with arguments (4, 5) {}
Leaving add with result 9
Adding 4 and 5 results in 9
The specification for add is name=add args=ArgSpec(args=['a', 'b'], varargs=None, keywords=None, defaults=None)
The specification for trace is name=trace args=ArgSpec(args=['func'], varargs=None, keywords=None, defaults=None)
```

### Decorators with arguments

Once one starts using decorators, pretty soon one wants to parameterise their behaviour. For example we might want to decide whether to log the entry/exit arguments and return values at the point where we apply the trace decorator to each function. I have seen people struggle with decorators when it gets to using decorators with arguments. But if one understands the basics of how decorators work, it is actually very easy to figure it out.

Part of the confusion stems from what exactly a decorator is. For the purposes of this discussion, a decorator is a function which takes exactly one argument, a function, and returns exactly one value, which is also a function. If you want to parameterise a decorator, the way to do it (using a term from the PEP) is to create a decomaker. A decomaker is a function which takes a bunch of arguments and returns a decorator.

Let us see the parameterised implementation of trace below. Under the definitions we just laid down, trace is no longer a decorator. It is a decomaker, ie. it takes a bunch of arguments and returns a decorator. That decorator in turn decorates the function add.

For the remainder of this post, I shall use decorator as a generic placeholder for both decorators and decomakers (as generally most writings in python do). But hopefully the context should allow you to disambiguate as necessary.

```
from functools import wraps

def trace(trace_arguments=False):
    def decorator(func):
        @wraps(func) # <-- how the wrapped function retains the original name
        def wrapper(*args, **kwargs):
            if trace_arguments:
                print("Entering {} with arguments {} {}".format(
                    func.__name__, args, kwargs))
            else:
                print("Entering {}".format(func.__name__))
            ret = func(*args, **kwargs)
            if trace_arguments:
                print("Leaving {} with result {}".format(func.__name__, ret))
            else:
                print("Leaving {}".format(func.__name__))
            return ret
        return wrapper
    return decorator

@trace(True)
def add(a, b):
    return a + b

@trace()
def subtract(a, b):
    return a - b

print("3 plus 5 is {}".format(add(3, 5)))
print("5 minus 3 is {}".format(subtract(5, 3)))
```
```
Entering add with arguments (3, 5) {}
Leaving add with result 8
3 plus 5 is 8
Entering subtract
Leaving subtract
5 minus 3 is 2
```

### Composing decorators

Unsurprisingly decorators can be composed. After all they are just functions which accept one function and return another function, so composing should be trivial. This is often popularly referred to as chaining.

Here's an example

```
from functools import wraps

def trace(trace_arguments=False):
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            if trace_arguments:
                print("Entering {} with arguments {} {}".format(
                    func.__name__, args, kwargs))
            else:
                print("Entering {}".format(func.__name__))
            ret = func(*args, **kwargs)
            if trace_arguments:
                print("Leaving {} with result {}".format(func.__name__, ret))
            else:
                print("Leaving {}".format(func.__name__))
            return ret
        return wrapper
    return decorator

# using a global for brevity
function_calls = 0

def track(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        global function_calls
        function_calls = function_calls + 1
        return func(*args, **kwargs)
    return wrapper

@track
@trace(True)
def add(a, b):
    return a + b

print("Function calls so far {}".format(function_calls))
print("3 + 5 = {}".format(add(3, 5)))
print("2 + 7 = {}".format(add(2, 7)))
print("Function calls so far {}".format(function_calls))
```
```
Function calls so far 0
Entering add with arguments (3, 5) {}
Leaving add with result 8
3 + 5 = 8
Entering add with arguments (2, 7) {}
Leaving add with result 9
2 + 7 = 9
Function calls so far 2
```
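Stacked decorators are just nested calls: the @ syntax applies the decorator closest to the function first. A minimal sketch with two trivial, invented decorators makes the ordering visible:

```python
def shout(func):
    def wrapper(x):
        return func(x).upper()
    return wrapper

def exclaim(func):
    def wrapper(x):
        return func(x) + "!"
    return wrapper

@shout
@exclaim
def greet(name):
    return "hello " + name

# Equivalent to: greet = shout(exclaim(greet))
print(greet("world"))  # HELLO WORLD!
```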

## Use of function decorators

Function decorators are frequently used and are quite helpful. I doubt if I can create a comprehensive set of use cases, but will list some of the ones where I've observed decorators are useful.

### Applying aspects to functions

Decorators are frequently used as an implementation technique for some facets of Aspect Oriented Programming. The trace decorator (or decomaker) I described above implements a tracing aspect. You could imagine other decorators which for example update global counters (to keep track of how many web requests have been served), authorisation decorators for implementing ACLs, or memoization ie. to cache results computed earlier and serve them if the function was invoked with the same arguments again. eg. functools.lru_cache is a memoization decorator.
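For instance, the standard library's memoization decorator mentioned above can be applied directly (available from Python 3.2 onwards):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    # each distinct n is computed only once; repeats are served from the cache
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))           # 832040, computed in linear rather than exponential time
print(fib.cache_info())  # hit/miss statistics recorded by the decorator
```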

### Autoregistration of function handlers

Many web frameworks allow you to configure the url route around its handler function using a decorator as below.

```
@route("/show")
def show(request):
    # process the request and return the generated HTML
```

In this case there is a global routes repository list which contains the routes and their handler functions. In the above example "/show" is the url_prefix that, if matched, should result in the show handler function being called. The routes engine matches an incoming request against the routes in the list, and if a match is found, the corresponding handler function is called with the request as an argument. Without decorators, you would need a separate place where all the routes and their handlers could be configured (sometimes called wiring up). With decorators like the above, this configuration can be done at the site of each function declaration. (The part where the runtime discovers all such functions and calls them once in order to actually perform the registration is beyond the scope of this post.) We could use decorators similarly to register event listeners etc.
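A minimal sketch of such a registry follows; every name here (route, routes, dispatch) is invented for illustration, and real frameworks differ considerably:

```python
routes = []  # global repository of (url_prefix, handler) pairs

def route(url_prefix):
    def decorator(func):
        routes.append((url_prefix, func))  # register at declaration site
        return func  # return the function itself, unwrapped
    return decorator

@route("/show")
def show(request):
    return "<html>showing {}</html>".format(request)

def dispatch(path, request):
    for prefix, handler in routes:
        if path.startswith(prefix):
            return handler(request)

print(dispatch("/show", "req-1"))  # <html>showing req-1</html>
```

Note that this decomaker registers the function and returns it undecorated, so show remains directly callable and inspectable.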

Sometimes we want to mark a function as having a specific characteristic so that it can be processed differently at runtime, eg. celery marks each task handler using the @app.task decorator. One might want to assign more metadata to the function. Decorators can also be used for basic housekeeping (eg. the wraps decorator seen earlier). Python itself requires us to use the @staticmethod and @classmethod decorators to mark appropriate methods on a class.
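Attaching metadata can be as simple as setting an attribute on the function; a hypothetical sketch (mark and its keywords are invented names):

```python
def mark(**metadata):
    def decorator(func):
        func.metadata = dict(metadata)  # attach metadata without wrapping
        return func
    return decorator

@mark(queue="emails", retries=3)
def send_email(to, body):
    return "sent to {}".format(to)

print(send_email.metadata)  # {'queue': 'emails', 'retries': 3}
```

Runtime machinery can later inspect `send_email.metadata` to decide how to process the function.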

## Class decorators

Class decorators were introduced in Python 2.6 via PEP-3129. A class decorator is a function which gets a python class as an input. It performs the necessary decoration activities on the class and returns the decorated class. Much of what class decorators achieve can also be achieved through metaprogramming, and more often than not, being able to write a good class decorator requires a decent understanding of the class meta model.

The following example demonstrates how a class decorator can be used to substitute a class constructor to print a statement and then eventually call the original class constructor.

```
def trace(class_):
    original_init = class_.__init__
    def __init__(self, *args, **kws):
        print("Instantiating an object of class {}".format(class_.__name__))
        original_init(self, *args, **kws)
    class_.__init__ = __init__
    return class_

@trace
class Foo(object):
    def __init__(self, value):
        self.value = value

foo = Foo(5)
print("The value of foo is {}".format(foo.value))
```
```
Instantiating an object of class Foo
The value of foo is 5
```