
Chapter 7. Functions as First-Class Objects


Definition of a First Class Object #

Programming language researchers define a “first-class object” as a program entity that can be:

  • Created at runtime
  • Assigned to a variable or element in a data structure
  • Passed as an argument to a function
  • Returned as the result of a function
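All four properties can be demonstrated with a plain function; a minimal sketch (names are illustrative):

```python
# A plain function exhibits all four first-class properties:

def square(n):
    return n * n

f = square                                # assigned to a variable
registry = {'sq': square}                 # stored in a data structure
squares = list(map(square, [1, 2, 3]))    # passed as an argument to map

def make_adder(k):                        # a function created at runtime and
    def add(n):                           # returned as the result of a function
        return n + k
    return add

add5 = make_adder(5)
print(f(4), registry['sq'](3), squares, add5(10))
```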

What’s New in This Chapter #

Treating a Function Like an Object #

  • the __doc__ attribute holds the docstring, which is what help(<fn_name>) displays

  • having first-class functions enables programming in a functional style

Higher-Order Functions #

  • a higher-order function takes a function as an argument and/or returns a function as its result

  • apply was deprecated (and removed in Python 3) because argument unpacking does the same job directly

    so a call can be written as fn(*args, **kwargs)
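A quick sketch of a fully variadic signature, where positional arguments are captured as a tuple and keyword arguments as a dict:

```python
def report(*args, **kwargs):
    # args is a tuple of positional arguments, kwargs a dict of keyword arguments
    return args, kwargs

print(report(1, 2, x=3))   # ((1, 2), {'x': 3})
```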

Modern Pythonic Replacements for map, filter, and reduce #

  • map and filter return lazy iterators in Python 3 \(\implies\) we can just directly define genexps now instead.

  • reduce is no longer a builtin, KIV this for a later chapter in this book.
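A side-by-side sketch of the two styles (list values are illustrative):

```python
fruits = ['fig', 'apple', 'cherry', 'plum']

# map + filter style (both return lazy iterators in Python 3):
upper1 = list(map(str.upper, filter(lambda f: len(f) > 3, fruits)))

# equivalent comprehension, usually considered more readable:
upper2 = [f.upper() for f in fruits if len(f) > 3]

print(upper1 == upper2, upper1)   # True ['APPLE', 'CHERRY', 'PLUM']
```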

Anonymous Functions #

  • The best use of anonymous functions is in the context of an argument list for a higher-order function.

    e.g. sorted(fruits, key=lambda word:word[::-1])
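The same call in full, runnable form (the fruits list is my addition for illustration; sorting by the reversed word groups e.g. all the -berry words together):

```python
fruits = ['strawberry', 'fig', 'apple', 'cherry', 'raspberry', 'banana']
print(sorted(fruits, key=lambda word: word[::-1]))
# ['banana', 'apple', 'fig', 'raspberry', 'strawberry', 'cherry']
```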

python anon function bodies are limited to pure expressions; limited language features available in lambda definition #

However, the simple syntax of Python limits the body of lambda functions to be pure expressions. In other words, the body cannot contain other Python statements such as while, try, etc. Assignment with = is also a statement, so it cannot occur in a lambda.

:= assignment syntax exists #

The new assignment expression syntax using := can be used—but if you need it, your lambda is probably too complicated and hard to read, and it should be refactored into a regular function using def.
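A small sketch of both options (the lambda is legal because := is an expression, but the def reads better):

```python
# Legal but discouraged: an assignment expression inside a lambda
norm = lambda s: (t := s.strip()) and t.lower()
print(norm('  Hello  '))   # 'hello'

# Clearer: refactor into a regular function
def norm2(s):
    t = s.strip()
    return t.lower()
```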

⭐️ The Nine Flavors of Callable Objects #

  • () is the call operator
  • generators, native coroutines, and async generator functions differ from the rest of the callable types in that their return values are never application data; they return objects that need further processing to yield application data / do useful work.

1) callable: User-defined Functions #

2) callable: builtin functions #

e.g. len

3) callable: builtin methods #

methods implemented in C e.g. dict.get

4) callable: methods #

fns defined in the body of a class

5) callable: Class #

  • classes are callable: calling a class invokes __new__ \(\rightarrow\) __init__

  • we can override __new__ if needed

  • there’s no new operator in Python

When invoked, a class runs its __new__ method to create an instance, then __init__ to initialize it, and finally the instance is returned to the caller. Because there is no new operator in Python, calling a class is like calling a function.
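A minimal sketch showing the order of the two calls when a class is invoked (the class and the calls list are illustrative):

```python
calls = []

class Greeter:
    def __new__(cls, name):
        calls.append('__new__')          # runs first: creates the instance
        return super().__new__(cls)

    def __init__(self, name):
        calls.append('__init__')         # runs second: initializes it
        self.name = name

g = Greeter('Ada')
print(calls, g.name)   # ['__new__', '__init__'] Ada
```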

6) callable: class instances #

if we define a __call__ method in the class

7) callable: Generator Functions #

  • they have yield in the function body – when called, they return a generator object

Generator functions Functions or methods that use the yield keyword in their body. When called, they return a generator object.
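A small sketch: calling the function runs none of its body; it just returns a generator object that produces values lazily.

```python
def countdown(n):
    while n:
        yield n
        n -= 1

g = countdown(3)        # no body code runs here; g is a generator object
print(g is iter(g))     # True: a generator is its own iterator
print(list(g))          # [3, 2, 1]
```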

8) callable: native coroutine functions #

  • functions that are async def \(\rightarrow\) when called, they return a coroutine object

Native coroutine functions Functions or methods defined with async def. When called, they return a coroutine object.

9) callable: async generator functions #

  • defined with async def and have yield in the body \(\rightarrow\) when called, they return an asynchronous generator
  • the returned asynchronous generator is consumed with async for

Asynchronous generator functions Functions or methods defined with async def that have yield in their body. When called, they return an asynchronous generator for use with async for. Added in Python 3.6.
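A compact sketch of flavors 8 and 9 together (function names are illustrative): calling either one returns an object to be driven by the event loop, not application data.

```python
import asyncio

async def double(x):               # native coroutine function
    return x * 2                   # calling double(21) returns a coroutine object

async def squares(n):              # asynchronous generator function (async def + yield)
    for i in range(n):
        yield i * i

async def main():
    print(await double(21))                 # 42
    print([v async for v in squares(4)])    # [0, 1, 4, 9]

asyncio.run(main())
```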

User-Defined Callable Types #

  • this relates to class instances that work as callable objects.

  • usecases:

    the main idea is really to have objects that can store their own internal state and are callable.

    Here’s two concrete cases:

    1. A class implementing __call__ is an easy way to create function-like objects that have some internal state that must be kept across invocations, like the remaining items in the BingoCage

    2. Decorators that need to remember state between calls of the decorator e.g. the @cache decorator

      this is also useful to split a complex implementation into staged, multi-decorator implementations.

      we’re familiar with this from using frameworks like flask or django where functions can have many decorators applies that does some pre-processing.

      • Practical Takeaways

        • Mental Model: Think of the decorator stack as a pipeline: each decorator acts before or after your main logic, transforming input, output, or context as needed.

        • Framework idioms: Nearly every Flask or Django extension adds its value via decorators to manage permissions, caching, rate limits, etc.

        • Custom use: You can easily define your own decorators to factor out repeated steps in your app for preprocessing (e.g., parameter parsing), postprocessing (e.g., formatting responses), or injecting cross-cutting security checks.

        In summary, the practice of splitting implementation using decorators (as in Flask/Django) is a core idiom for composing, reusing, and organizing web application logic in Python frameworks—and is a concrete, high-impact use of the broader decorator pattern described previously

        elaboration on the splitting up part:

               Splitting a concrete implementation into separate methods in Python using decorators typically refers to using decorators as a modular way to extend, preprocess, or postprocess your functions—effectively breaking up a monolithic function into composable steps. Decorators allow you to separate concerns and add reusable behaviors before or after the core logic, without modifying the original function directly.
        
               ## Key Concepts
        
       - **Decorators** are functions (or classes) that take another function/method, wrap it with additional behavior, and return the wrapped function.
       - This is useful for **"splitting" implementation concerns**: e.g., one decorator handles logging, another argument validation, another result transformation, etc.
       - Using multiple decorators, you can stack behaviors, creating a pipeline of processing stages for your function[1][5].
        
               ## Example: Splitting String Processing Steps
        
               Suppose you have a function that returns a string and you want to:
               1. Convert the result to uppercase.
               2. Then split it into words.
        
               You can write two decorators and apply them:
        
               ```python
               import functools
        
               def uppercase_decorator(func):
                   @functools.wraps(func)
                   def wrapper(*args, **kwargs):
                       result = func(*args, **kwargs)
                       return result.upper()
                   return wrapper
        
               def split_string_decorator(func):
                   @functools.wraps(func)
                   def wrapper(*args, **kwargs):
                       result = func(*args, **kwargs)
                       return result.split()
                   return wrapper
        
               @split_string_decorator
               @uppercase_decorator
               def greet():
                   return 'hello world'
        
               print(greet())  # Output: ['HELLO', 'WORLD']
               ```
        
               Here, the function’s logic is split: first uppercase conversion, then string splitting—each in its own decorator[1][5].
        
               ## Why Use This Pattern?
        
       - **Separation of concerns**: Each decorator encapsulates a single aspect, making your code easier to maintain, test, and reuse[1][6][9].
       - **Composable behaviors**: You can stack/reorder behaviors as needed.
       - **Cleaner implementation**: The core function stays focused on its business logic.
        
               ## Common Uses
        
       - **Argument or result preprocessing/postprocessing** (e.g., stripping strings, timing execution, error handling)[7].
       - **Cross-cutting concerns** like authentication, logging, caching.
       - **Extending methods in classes** with shared behaviors (see below caveat when decorating class methods)[3][4][8].
        
               ## Important Caveats
        
       - When applying decorators to class methods, the decorator receives either a function or a method, depending on implementation. You may need to ensure your decorator handles `self`/`cls` correctly, or else prefer using `functools.wraps` and testing thoroughly[3][4][8].
       - The order of decorator application matters, especially if each processes return values or arguments—always document and manage dependencies between them[1][5].
        
               ## Summary Table
        
               | Use Case         | Approach                      | Notes                           |
               |------------------|------------------------------|---------------------------------|
               | Preprocessing    | Decorator before function    | E.g., strip, validate inputs    |
               | Postprocessing   | Decorator after function     | E.g., format, cache results     |
               | Cross-cutting    | Multiple stacked decorators  | E.g., logging + auth + cache    |
        
               In summary, splitting a concrete implementation using decorators lets you modularize, extend, and compose functionality in clear, reusable units[1][6][9]. This is a foundational pattern in Python for writing clean, maintainable, and flexible code.
        
               [1] https://www.datacamp.com/tutorial/decorators-python
               [2] https://www.thepythoncodingstack.com/p/demystifying-python-decorators
               [3] https://stackoverflow.com/questions/62857278/different-behavior-when-implementing-a-decorator-in-python-with-a-function-or-a
               [4] http://blog.dscpl.com.au/2014/01/how-you-implemented-your-python.html
               [5] https://www.youtube.com/watch?v=NyHuBGqCBII
               [6] https://www.codingem.com/what-are-python-decorators/
               [7] https://stackoverflow.com/questions/13698549/python-decorator-for-function-argument-preprocessing
               [8] https://github.com/GrahamDumpleton/wrapt/blob/develop/blog/10-performance-overhead-when-applying-decorators-to-methods.md
               [9] https://itnext.io/python-decorators-demystified-cef81d7e986d
               [10] https://refactoring.guru/design-patterns/decorator/python/example
        

        elaboration linked to flask / django:

               Your observation is highly relevant: **Flask and Django make extensive use of Python decorators to split and modularize concrete implementations of web functionality.** This pattern is foundational to how these frameworks are built and used.
        
               ### Contextualized Explanation with Flask/Django
        
               - **In Flask:** Every view (endpoint) is just a function, and decorators are used to transparently inject additional behavior before or after your function executes. The classic example is `@app.route`, which registers the function as a route handler. Beyond routing, decorators are used extensively for **authentication (`@login_required`), templating, caching, API argument validation, CORS, response formatting, and more**.[1][2][5]
        
                   - For example, a "login required" decorator checks authentication before the underlying view runs:
                   ```python
                   from functools import wraps
                   from flask import g, request, redirect, url_for
        
                   def login_required(f):
                       @wraps(f)
                       def decorated_function(*args, **kwargs):
                           if g.user is None:
                               return redirect(url_for('login', next=request.url))
                           return f(*args, **kwargs)
                       return decorated_function
        
                   @app.route('/secret')
                   @login_required
                   def secret_view():
                       ...
                   ```
                   Here, **the authentication check is split out from your business logic** into a reusable decorator[5].
        
                   - Another example, templating can be split using a decorator. Rather than explicitly rendering a template in every view, you can return just the context dictionary, and a decorator handles the rendering:
                   ```python
                   def templated(template):
                       def decorator(f):
                           @wraps(f)
                           def decorated_function(*args, **kwargs):
                               ctx = f(*args, **kwargs)
                               return render_template(template, **ctx)
                           return decorated_function
                       return decorator
        
                   @app.route('/')
                   @templated('index.html')
                   def index():
                       return dict(value=42)
                   ```
                   This **splits view organization (data preparation) from view rendering**[1][5].
        
       - **In Django:** The pattern is similar, although Django adds a distinction between **middleware** (which wraps the whole request/response pipeline) and **decorators** (which wrap only specific views).
           - Decorators like `@login_required`, `@require_POST`, `@csrf_exempt`, etc., **modularize cross-cutting concerns** so they can be composed and reused across multiple views.
           - Django middleware shares some similarities with decorators, acting at a broader scope[4].
        
               ### Why Frameworks Encourage This
        
       - **Separation of concerns:** Decorators allow you to extract generic, orthogonal logic (like permissions, logging, or caching) from the core business function, improving maintainability and testability.
       - **Composable behaviors:** By stacking multiple decorators (order matters), you can chain checks, transformations, or wrappers around your views in a clear, declarative fashion[2].
       - **Minimizing boilerplate:** You avoid repetition—rather than implementing access checks, error handling, or formatting logic in each view, you write these as reusable decorators once.
        
               ### Practical Takeaways
        
       - **Mental Model:** Think of the decorator stack as a pipeline: each decorator acts before or after your main logic, transforming input, output, or context as needed.
       - **Framework idioms:** Nearly every Flask or Django extension adds its value via decorators to manage permissions, caching, rate limits, etc.
       - **Custom use:** You can easily define your own decorators to factor out repeated steps in your app for preprocessing (e.g., parameter parsing), postprocessing (e.g., formatting responses), or injecting cross-cutting security checks.
        
               In summary, **the practice of splitting implementation using decorators (as in Flask/Django) is a core idiom for composing, reusing, and organizing web application logic in Python frameworks**—and is a concrete, high-impact use of the broader decorator pattern described previously[1][2][5].
        
               ---
               [1]: View Decorators — Flask Documentation (3.1.x)
               [2]: Advanced patterns for views and routing - Explore Flask
               [4]: Are middlewares an implementation of the Decorator ...
               [5]: 12. View Decorators
        
               [1] https://flask.palletsprojects.com/en/stable/patterns/viewdecorators/
               [2] http://explore-flask.readthedocs.org/en/latest/views.html
               [3] https://www.geeksforgeeks.org/python/decorator-method-python-design-patterns/
               [4] https://stackoverflow.com/questions/48696631/are-middlewares-an-implementation-of-the-decorator-pattern
               [5] https://tedboy.github.io/flask/patterns/viewdecorators.html
               [6] https://flask.palletsprojects.com/en/stable/design/
               [7] https://moldstud.com/articles/p-flask-and-restful-design-patterns-a-practical-overview-for-developers
               [8] https://stackoverflow.com/questions/48248771/design-patterns-for-flask-api-implementation/63376123
               [9] https://python.plainenglish.io/the-magic-of-python-decorators-from-basics-to-advanced-techniques-94d25dc5f920
               [10] https://dev.to/senichimaro/flask-decorator-something-57gd
        

checking if something is callable #

just use the callable() builtin and it will return a boolean telling us whether the object is callable
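A quick check (the sample objects are illustrative):

```python
print([callable(obj) for obj in (abs, str, 'Ni!')])   # [True, True, False]
```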

for custom obj to be callable, just have to implement __call__ #

arbitrary Python objects may also be made to behave like functions. Implementing a __call__ instance method is all it takes.

  • usecase - example use case could be having an alias to a default func within the class e.g. bingo.pick() and bingo()
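A trimmed sketch of the book's BingoCage idea, showing bingo.pick() and the bingo() alias via __call__ (this body is my reconstruction, simplified):

```python
import random

class BingoCage:
    """Callable with internal state: the remaining items."""

    def __init__(self, items):
        self._items = list(items)
        random.shuffle(self._items)

    def pick(self):
        try:
            return self._items.pop()
        except IndexError:
            raise LookupError('pick from empty BingoCage') from None

    def __call__(self):          # makes bingo() an alias for bingo.pick()
        return self.pick()

bingo = BingoCage(range(3))
print(bingo.pick(), bingo(), callable(bingo))
```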

usecase - implementing decorators that hold state across decorator invocation #


Another good use case for __call__ is implementing decorators. Decorators must be callable, and it is sometimes convenient to “remember” something between calls of the decorator (e.g., for memoization—caching the results of expensive computations for later use) or to split a complex implementation into separate methods. The functional approach to creating functions with internal state is to use closures. Closures, as well as decorators, are the subject of Chapter 9.
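A minimal sketch of a __call__-based memoizing decorator, with per-instance state kept across calls (the Memoize class is illustrative; the stdlib's functools.cache does this for real use):

```python
class Memoize:
    """Decorator as a class: the instance wraps func and keeps a cache."""

    def __init__(self, func):
        self.func = func
        self.cache = {}              # state remembered between calls

    def __call__(self, *args):
        if args not in self.cache:
            self.cache[args] = self.func(*args)
        return self.cache[args]

@Memoize
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30), len(fib.cache))   # 832040 31
```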

closures are the functional approach to having functions with internal state (in contrast to decorators being used for this same purpose) #

KIV this for a future chapter (chapter 9)

From Positional to Keyword-Only Parameters #

Python’s argument declaration features.

This part is about argument captures when defining functions.

There’s two types of interesting ways to pass variadic params:

  1. positional *args
  2. keyword based **kwargs

NOTE: the usual params can always be passed in keyword fashion (unless it’s positional-only)

Keyword-only parameters #

There’s two types of argument passing to consider when writing functions:

variable positional args #

  • accepting multiple positional arguments as tuple using *<arg_name>

  • this allows us to have variadic params

  • this gives us a positional tuple for these args

`def tag(name, *content, class_=None, **attrs):`

we can define a keyword-only argument signature for functions #

To specify keyword-only arguments when defining a function, name them after the argument prefixed with *.

  • if don’t want to support variable position args, then do def f(a, *, b):

Keyword-only arguments are a feature of Python 3. In Example 7-9, the class_ parameter can only be given as a keyword argument—it will never capture unnamed positional arguments. To specify keyword-only arguments when defining a function, name them after the argument prefixed with *. If you don’t want to support variable positional arguments, then just do this: def f(a, *, b):
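A runnable sketch of both signatures (the tag body is my reconstruction, simplified from the book's Example 7-9):

```python
def tag(name, *content, class_=None, **attrs):
    """Generate one or more HTML tags."""
    if class_ is not None:
        attrs['class'] = class_
    attr_str = ''.join(f' {k}="{v}"' for k, v in sorted(attrs.items()))
    if content:                       # *content captured as a tuple
        return '\n'.join(f'<{name}{attr_str}>{c}</{name}>' for c in content)
    return f'<{name}{attr_str} />'

print(tag('p', 'hello', class_='sidebar'))   # <p class="sidebar">hello</p>

def f(a, *, b):          # bare * makes b keyword-only; f(1, 2) raises TypeError
    return a, b

print(f(1, b=2))         # (1, 2)
```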

Positional-Only Parameters (using /) #

defining position-only params to a function #

e.g. `def divmod(a, b, /)`

after the / we can specify args as per usual

To define a function requiring positional-only parameters, use / in the parameter list. This example from “What’s New In Python 3.8” shows how to emulate the divmod built-in function: `def divmod(a, b, /): return (a // b, a % b)`. All arguments to the left of the / are positional-only. After the /, you may specify other arguments, which work as usual.

Packages for Functional Programming #

The intent wasn’t to support the typical functional paradigms but there is support.

The operator module is for useful operations, the functools modules is for useful higher order functions.

functools also has some higher order functions that can be used as decorators (e.g. cache, singledispatch)

The operator Module #

  • the operator module has a bunch of callable functions that we can use as params for higher order functions.

  • TO_HABIT: use itemgetter and attrgetter more when accessing nested objects!

provides function equivalents to operators for trivial things #

These are callable functions, for example:

  • itemgetter: Essentially, itemgetter(1) creates a function that, given a collection, returns the item at index 1. That’s easier to write and read than lambda fields: fields[1], which does the same thing.

  • attrgetter: attrgetter supports nested accesses


    A sibling of itemgetter is attrgetter, which creates functions to extract object attributes by name. If you pass attrgetter several attribute names as arguments, it also returns a tuple of values. In addition, if any argument name contains a . (dot), attrgetter navigates through nested objects to retrieve the attribute.

    attrgetter demo


Example 7-14. Demo of attrgetter to process a previously defined list of namedtuples called metro_data (the same list that appears in Example 7-13):

```python
>>> from collections import namedtuple
>>> LatLon = namedtuple('LatLon', 'lat lon')
>>> Metropolis = namedtuple('Metropolis', 'name cc pop coord')
>>> metro_areas = [Metropolis(name, cc, pop, LatLon(lat, lon))
...     for name, cc, pop, (lat, lon) in metro_data]
>>> metro_areas[0]
Metropolis(name='Tokyo', cc='JP', pop=36.933, coord=LatLon(lat=35.689722, lon=139.691667))
>>> metro_areas[0].coord.lat
35.689722
>>> from operator import attrgetter
>>> name_lat = attrgetter('name', 'coord.lat')
>>>
>>> for city in sorted(metro_areas, key=attrgetter('coord.lat')):
...     print(name_lat(city))
...
('São Paulo', -23.547778)
('Mexico City', 19.433333)
('Delhi NCR', 28.613889)
('Tokyo', 35.689722)
('New York-Newark', 40.808611)
```

also can refer to this perplexity overview of what attrgetter can do:

The `operator.attrgetter` function in Python is a convenient way to create a callable that retrieves attributes from objects, including deeply nested ones. (For nested dictionaries, which have keys rather than attributes, the analogous tool is `operator.itemgetter`.) Below are examples demonstrating both.

### Example 1: Accessing Nested Attributes in Objects

Suppose you have a class structure representing a person and their address:

```python
from operator import attrgetter

class Address:
    def __init__(self, city, state):
        self.city = city
        self.state = state

class Person:
    def __init__(self, name, address):
        self.name = name
        self.address = address

# Create instances of Address and Person
address1 = Address("New York", "NY")
address2 = Address("Los Angeles", "CA")

person1 = Person("Alice", address1)
person2 = Person("Bob", address2)

# List of persons
people = [person1, person2]

# Use attrgetter to access nested attributes
get_city = attrgetter('address.city')
cities = list(map(get_city, people))

print(cities)  # Output: ['New York', 'Los Angeles']
```

### Example 2: Accessing Nested Values in a Dictionary

If you have a nested dictionary structure, `attrgetter` won't work (dicts have keys, not attributes), so you use `itemgetter` instead. Here’s an example:

```python
from operator import itemgetter

# Nested dictionary structure
data = [
    {"name": "Alice", "location": {"city": "New York", "state": "NY"}},
    {"name": "Bob", "location": {"city": "Los Angeles", "state": "CA"}},
]

# Accessing the city using itemgetter for nested dictionaries
get_city = itemgetter('location')
cities = [itemgetter('city')(get_city(person)) for person in data]

print(cities)  # Output: ['New York', 'Los Angeles']
```

### Example 3: Combining Multiple Attributes

You can also use `attrgetter` to fetch multiple attributes at once. For instance, if you want to get both the city and state from the nested structure:

```python
from operator import attrgetter

class Address:
    def __init__(self, city, state):
        self.city = city
        self.state = state

class Person:
    def __init__(self, name, address):
        self.name = name
        self.address = address

# Create instances of Address and Person
address1 = Address("New York", "NY")
address2 = Address("Los Angeles", "CA")

person1 = Person("Alice", address1)
person2 = Person("Bob", address2)

# List of persons
people = [person1, person2]

# Use attrgetter to access both city and state
get_location = attrgetter('address.city', 'address.state')
locations = list(map(get_location, people))

print(locations)  # Output: [('New York', 'NY'), ('Los Angeles', 'CA')]
```

### Summary

- **`operator.attrgetter`**: This function is useful for creating callables that can retrieve attributes from objects. It works well with nested structures.
- **Accessing Nested Attributes**: You can easily access deeply nested attributes using `attrgetter`, which simplifies your code and improves readability.
- **Combining Attributes**: You can retrieve multiple attributes at once by passing them as arguments to `attrgetter`.

These examples illustrate how `operator.attrgetter` can be effectively used for accessing nested attributes in both object-oriented and dictionary-based data structures in Python.

Citations:
[1] https://florian-dahlitz.de/articles/introduction-to-pythons-operator-module
[2] https://note.nkmk.me/en/python-operator-usage/
[3] https://teamtreehouse.com/community/keyitemgetterattrgetter
[4] https://discuss.python.org/t/operator-attrgetter-itemgetter-for-setting/20544
[5] https://www.digitalocean.com/community/tutorials/how-to-use-args-and-kwargs-in-python-3
[6] https://stackoverflow.com/questions/61452346/python-attrgetter-that-handles-none-values-and-can-be-used-in-a-loop
[7] https://www.kdnuggets.com/2023/02/python-function-arguments-definitive-guide.html
[8] https://www.youtube.com/watch?v=on77DvsVgrI

methodcaller vs defining a lambda for it #

“Example 7-15. Demo of methodcaller: second test shows the binding of extra arguments”

  • usecases:

    1. it allows us to partially freeze some args to an existing function

      e.g. hyphenate = methodcaller('replace', ' ', '-')

  • methodcaller is often preferable to a lambda because it yields one reusable callable bound to a method name (plus any extra arguments), instead of defining a new lambda function object at each use

  • references: SO post and the perplexity dump below
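The Example 7-15 demo the source names, in runnable form (the sample string is from the book):

```python
from operator import methodcaller

s = 'The time has come'
upcase = methodcaller('upper')
print(upcase(s))                               # THE TIME HAS COME

hyphenate = methodcaller('replace', ' ', '-')  # extra args are bound to the call
print(hyphenate(s))                            # The-time-has-come
```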

The `operator.methodcaller` function in Python provides a way to create a callable that invokes a method on an object, using the method's name specified as a string. This can be particularly useful in scenarios where you need to call the same method repeatedly on different objects or when working with higher-order functions like `map` or `filter`.

### Value Proposition of Using `operator.methodcaller`

1. **Readability**: Using `methodcaller` can improve readability by clearly indicating that you are calling a method, especially when the method name is dynamic or when you are applying it across multiple items.

2. **Performance**: `methodcaller` can be more efficient than using a lambda function because it avoids the overhead of creating a new function object for each call. Instead, it creates a single callable that can be reused.

3. **Simplicity**: It simplifies code when you need to apply the same method call to multiple objects, making your intentions clearer without having to define multiple lambda functions.

4. **Consistency**: It provides a consistent interface for calling methods, especially when dealing with collections of objects where the method names might be determined at runtime.

### Example Usage of `operator.methodcaller`

Here's how you can use `operator.methodcaller` in practice:

#### Example 1: Basic Usage

```python
from operator import methodcaller

# Define a list of strings
strings = ["hello", "world", "python"]

# Use methodcaller to call the upper() method on each string
upper_case_strings = list(map(methodcaller('upper'), strings))

print(upper_case_strings)  # Output: ['HELLO', 'WORLD', 'PYTHON']
```

In this example, `methodcaller('upper')` creates a callable that converts strings to uppercase. The `map` function applies this callable to each string in the list.

#### Example 2: Calling Methods with Arguments

You can also use `methodcaller` to call methods that require arguments:

```python
from operator import methodcaller

# Define a list of strings
strings = ["hello", "world", "python"]

# Use methodcaller to call the rjust() method on each string
right_justified_strings = list(map(methodcaller('rjust', 10, '.'), strings))

print(right_justified_strings)  # Output: ['.....hello', '.....world', '....python']
```

In this case, `methodcaller('rjust', 10, '.')` creates a callable that right-justifies each string to a width of 10 characters, filling with dots.

### Comparison with Lambda Functions

While you could achieve similar functionality using lambda functions, there are notable differences:

- **Performance**: As mentioned earlier, using `methodcaller` can be faster than using a lambda because it avoids the overhead associated with creating new function objects for each call.

- **Conciseness**: For simple method calls, `methodcaller` can make your code cleaner and more concise compared to writing out lambda expressions.

- **Readability**: Using `methodcaller` explicitly indicates that you're invoking a method, which may enhance code clarity compared to lambda functions that might require additional context to understand what they do.

### Conclusion

Using `operator.methodcaller` provides several advantages over defining lambdas for method calls, including improved readability, performance benefits, and simplicity. It is particularly useful in functional programming contexts where you need to apply methods across collections of objects efficiently and clearly.

Citations:
[1] https://stackoverflow.com/questions/2705104/lambda-vs-operator-attrgetterxxx-as-a-sort-key-function
[2] https://florian-dahlitz.de/articles/introduction-to-pythons-operator-module
[3] https://martinheinz.dev/blog/54
[4] https://treyhunner.com/2018/09/stop-writing-lambda-expressions/
[5] https://note.nkmk.me/en/python-operator-usage/
[6] https://realpython.com/python-operator-module/
[7] https://news.ycombinator.com/item?id=28104674
[8] https://discuss.python.org/t/operator-attrgetter-itemgetter-for-setting/20544

Freezing Arguments with functools.partial #

it’s a partial application of another function, where we freeze some params

e.g. nfc = functools.partial(unicodedata.normalize, 'NFC')

Another is partial: given a callable, it produces a new callable with some of the arguments of the original callable bound to predetermined values. This is useful to adapt a function that takes one or more arguments to an API that requires a callback with fewer arguments.
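The nfc example above in runnable form: partial freezes the first argument of unicodedata.normalize, and the two visually identical strings compare equal only after normalization.

```python
import functools
import unicodedata

nfc = functools.partial(unicodedata.normalize, 'NFC')
s1 = 'café'                 # composed "e with acute"
s2 = 'cafe\u0301'           # 'e' + combining acute accent
print(s1 == s2)             # False
print(nfc(s1) == nfc(s2))   # True
```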

Chapter Summary #

  • main ideas are that you can assign functions to variables, pass them to other functions, store them in data structures, and access function attributes, allowing frameworks and tools to act on that information.

  • Python uses a statement-oriented syntax in which expressions cannot contain statements, and many language constructs are statements —including try/catch, which is what I miss most often when writing lambdas. This is the price to pay for Python’s highly readable syntax.

  • it’s arguably a good thing that the “functional features” in Python are limited: the constraints keep the language simpler and easier to read.

NOTE: Python doesn’t do tail-call optimization (TCO) #

reasons:

  1. it breaks stack traces: eliminated tail calls would vanish from the traceback, making debugging harder

  2. not aligned with Python’s idioms

    • TCO is typically needed to avoid stack overflow in fundamentally functional languages, where deep recursion is idiomatic; in Python, explicit loops are preferred
  3. hard to implement in the Python interpreter.

    the current interpreter is supposed to:

    • allow for dynamic typing
    • allow for introspection

    it’s hard to implement TCO while also upholding these principles.

The biggest obstacle to wider adoption of functional programming idioms in Python is the lack of tail-call elimination, an optimization that allows memory-efficient computation of a function that makes a recursive call at the “tail” of its body. In another blog post, “Tail Recursion Elimination”, Guido gives several reasons why.

here’s a more comprehensive perplexity output:

Python does not implement tail recursion optimization (TCO), meaning recursive functions—even if tail-recursive—consume a new stack frame for each call, and will hit the recursion depth limit rather than looping indefinitely. This is a deliberate design decision, most famously articulated by Python’s creator, Guido van Rossum[2][3][4][8][10].

### Guido van Rossum’s Reasoning

Guido has explicitly written about why Python doesn’t and won’t adopt TCO:
- **Readability and Debuggability:** A major reason is that Python values clear tracebacks for debugging. If TCO were implemented, the call stack would no longer show the chain of tail-recursive calls, since repeated “tail” calls would overwrite each other instead of expanding the stack. This would make it much harder to trace the logical path that led to an error[2][3].
- **Not Essential for Python’s Intended Use:** In languages that are fundamentally functional (like Scheme), deep recursion is idiomatic, and TCO is necessary to avoid stack overflow. In Python, the expectation is that if you need repeated iteration, you should use explicit loops, which are readable and efficient. Recursion in Python is more commonly used for problems where the recursion depth will be limited in practice, such as tree traversal or parsing, rather than as a substitute for iteration[2][4][8][10].
- **Implementation Complexity:** Due to the dynamic, introspective nature of Python, reliably transforming all tail calls into jumps is technically challenging and would complicate the interpreter. Guido also noted that you can't always know at compile time what function or method is actually being called due to Python’s dynamic features[2].

### Community Patterns and Workarounds

- Python coders generally prefer explicit loops over deep recursion for iterative processes.
- Some clever decorators and tricks (often involving stack frame introspection and exception handling) can mimic TCO in restricted circumstances, but these are not supported officially and can introduce their own overhead or edge cases[5][7].
- The maximum recursion depth in standard Python is limited (e.g., 1,000 frames), highlighting that Python is not optimized for deep recursion by design[5][10].
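The points above can be demonstrated with a small sketch: a tail-recursive function still consumes a stack frame per call and hits `RecursionError`, while the idiomatic loop rewrite runs in constant stack space (function names here are made up for illustration):

```python
import sys

def countdown_recursive(n):
    # Tail-recursive in form, but CPython still grows the stack per call.
    if n == 0:
        return 'done'
    return countdown_recursive(n - 1)

def countdown_loop(n):
    # The idiomatic Python rewrite: an explicit loop, constant stack space.
    while n:
        n -= 1
    return 'done'

limit = sys.getrecursionlimit()  # typically 1000 in CPython
try:
    countdown_recursive(limit + 100)
except RecursionError:
    print('recursive version hit the stack limit')

print(countdown_loop(limit + 100))  # the loop has no such limit
```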

### Guido’s Perspective (from “Tail Recursion Elimination”)

> "Tail recursion elimination is necessary in functional languages with no side effects, like Scheme, but not in a language with explicit state and tracebacks like Python. Tail recursion requires changing semantics that people may not expect, and it destroys stack traces." [2]

### Summary Table

| Language         | Tail Recursion Optimization? | Reason                                                        |
|------------------|-----------------------------|---------------------------------------------------------------|
| Python           | No                          | Preserves debuggability, prefers explicit loops               |
| Scheme/Lisp      | Yes                         | Recursion is idiomatic, required for efficient iteration      |
| Haskell          | Yes                         | Functional style, deep recursion common                       |

In summary, Python intentionally does not optimize tail calls to preserve debuggability, consistency, and code clarity. This is a conscious part of the language philosophy articulated by Guido and is unlikely to change[2][3][4][10].

[1] https://en.wikipedia.org/wiki/Tail_call
[2] http://neopythonic.blogspot.com/2009/04/tail-recursion-elimination.html
[3] https://stackoverflow.com/questions/13591970/does-python-optimize-tail-recursion
[4] https://www.reddit.com/r/learnpython/comments/q2oggt/does_python_support_tailrecursion/
[5] https://riptutorial.com/python/example/17857/tail-recursion-optimization-through-stack-introspection
[6] https://stackoverflow.com/questions/37193076/why-is-tail-recursion-optimization-faster-than-normal-recursion-in-python
[7] https://stackoverflow.com/questions/27417874/tail-recursion-optimization-decorator-in-python
[8] https://www.designgurus.io/answers/detail/does-python-optimize-tail-recursion
[9] https://www.geeksforgeeks.org/dsa/why-is-tail-recursion-optimization-faster-than-normal-recursion/
[10] https://www.geeksforgeeks.org/dsa/tail-recursion-in-python/

Further Reading #