Reputation: 3855
Function Annotations: PEP-3107
I ran across a snippet of code demonstrating Python3's function annotations. The concept is simple but I can't think of why these were implemented in Python3 or any good uses for them. Perhaps SO can enlighten me?
How it works:
def foo(a: 'x', b: 5 + 6, c: list) -> max(2, 9):
    ... function body ...
Everything following the colon after an argument is an 'annotation', and the information following the -> is an annotation for the function's return value.
foo.func_annotations would return a dictionary:
{'a': 'x',
'b': 11,
'c': list,
'return': 9}
What's the significance of having this available?
Upvotes: 173
Views: 61440
Reputation: 2098
I could be wrong but one of the possible benefits is separation of concerns!
I could leave
(a) the information on the types of args, kwargs and the return value to function annotations (so that the information is easily visible in modern IDEs when trying to use the function, and also in documentation built with tools like Sphinx)
and
(b) the description of the purpose (along with any algorithmic or implementation details) of the function/method to the docstring, without involving the type hints there.
Below is a very simple example code where this is achieved.
"""
Possible benefit of annotations
"""
def f1_return_a_float(a: float, b: int)->float:
"""
returns sum of two numbers typecasted to float
"""
return float(a+b)
if __name__ == "__main__":
print(f1_return_a_float.__doc__)
print(f1_return_a_float.__annotations__)
As you can see, the docstring and the annotations give separate and complementary details about the code. Also, do not forget the nice help an IDE gives when you are trying to call the function; it is better than finding one's way through the docstring. It looks better in the generated Sphinx output as well.
Upvotes: 0
Reputation: 4244
This is a way late answer, but AFAICT, the best current use of function annotations is PEP 484 and mypy. There's also Pyright from Microsoft, which is used by VS Code and is also available via the CLI.
Mypy is an optional static type checker for Python. You can add type hints to your Python programs using the upcoming standard for type annotations introduced in Python 3.5 beta 1 (PEP 484), and use mypy to type check them statically.
Used like so:
from typing import Iterator

def fib(n: int) -> Iterator[int]:
    a, b = 0, 1
    while a < n:
        yield a
        a, b = b, a + b
Upvotes: 49
Reputation: 89749
I think this is actually great.
Coming from an academic background, I can tell you that annotations have proved themselves invaluable for enabling smart static analyzers for languages like Java. For instance, you could define semantics like state restrictions, which threads are allowed access, architecture limitations, etc., and there are quite a few tools that can then read these and process them to provide assurances beyond what you get from the compilers. You could even write things that check preconditions/postconditions.
I feel something like this is especially needed in Python because of its dynamic typing, but there were really no constructs that made this straightforward and part of the official syntax.
There are other uses for annotations beyond assurance. I can see how I could apply my Java-based tools to Python. For instance, I have a tool that lets you assign special warnings to methods, and gives you indications when you call them that you should read their documentation (E.g., imagine you have a method that must not be invoked with a negative value, but it's not intuitive from the name). With annotations, I could technically write something like this for Python. Similarly, a tool that organizes methods in a large class based on tags can be written if there is an official syntax.
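To make that concrete, here is a hypothetical sketch of what such a tool could look like in Python (all names below are made up for illustration; this is not an actual port of the Java tools mentioned above): a decorator that surfaces string annotations as call-time reminders to read the documentation.

import functools
import warnings

def documented_warnings(func):
    # Hypothetical sketch: turn string annotations into call-time reminders.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        for name, note in func.__annotations__.items():
            if isinstance(note, str):
                warnings.warn("{}(): {} -- {}".format(func.__name__, name, note))
        return func(*args, **kwargs)
    return wrapper

@documented_warnings
def set_timeout(seconds: "must not be negative; see the documentation"):
    pass

set_timeout(5)  # emits a UserWarning pointing at the documentation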
Upvotes: 96
Reputation: 7362
Python 3.X (only) also generalizes function definition to allow arguments and return values to be annotated with object values for use in extensions.
It's metadata that explains, or makes more explicit, the values a function expects and returns.
Annotations are coded as :value after the argument name and before a default, and as ->value after the argument list. They are collected into an __annotations__ attribute of the function, but are not otherwise treated as special by Python itself:
>>> def f(a:99, b:'spam'=None) -> float:
...     print(a, b)
...
>>> f(88)
88 None
>>> f.__annotations__
{'a': 99, 'b': 'spam', 'return': <class 'float'>}
Source: Python Pocket Reference, Fifth Edition
EXAMPLE:
The typeannotations module provides a set of tools for type checking and type inference of Python code. It also provides a set of types useful for annotating functions and objects.
These tools are mainly designed to be used by static analyzers such as linters, code completion libraries and IDEs. Additionally, decorators for making run-time checks are provided. Run-time type checking is not always a good idea in Python, but in some cases it can be very useful.
https://github.com/ceronman/typeannotations
How Typing Helps to Write Better Code
Typing can help you do static code analysis to catch type errors before you send your code to production and prevents some obvious bugs. There are tools like mypy, which you can add to your toolbox as part of your software life cycle. mypy can check for correct types by running against your codebase partially or fully. mypy also helps you to detect bugs such as checking for the None type when a value is returned from a function. Typing helps to make your code cleaner. Instead of documenting your code using comments, where you specify types in a docstring, you can use types without any performance cost.
Source: Clean Python: Elegant Coding in Python, ISBN-13 (pbk): 978-1-4842-4877-5
PEP 526 -- Syntax for Variable Annotations
https://www.python.org/dev/peps/pep-0526/
https://www.attrs.org/en/stable/types.html
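As a small illustration of the None-related check mentioned above (the function names here are made up, and the exact mypy message may differ), mypy would flag code like the following, where a possibly-None return value is used without a check:

from typing import List, Optional

def find_index(values: List[int], target: int) -> Optional[int]:
    # Returns None when the target is missing.
    for i, value in enumerate(values):
        if value == target:
            return i
    return None

def shifted_index(values: List[int], target: int) -> int:
    # mypy reports an error here: the result may be None,
    # so adding 1 is unsafe until the None case is handled.
    return find_index(values, target) + 1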
Upvotes: 3
Reputation: 1246
As a bit of a delayed answer, several of my packages (marrow.script, WebCore, etc.) use annotations where available to declare typecasting (i.e. transforming incoming values from the web, detecting which arguments are boolean switches, etc.) as well as to perform additional markup of arguments.
Marrow Script builds a complete command-line interface to arbitrary functions and classes and allows for defining documentation, casting, and callback-derived default values via annotations, with a decorator to support older runtimes. All of my libraries that use annotations support the forms:
any_string # documentation
any_callable # typecast / callback, not called if defaulting
(any_callable, any_string) # combination
AnnotationClass() # package-specific rich annotation object
[AnnotationClass(), AnnotationClass(), …] # cooperative annotation
"Bare" support for docstrings or typecasting functions allows for easier mixing with other libraries that are annotation-aware. (I.e. have a web controller using typecasting that also happens to be exposed as a command-line script.)
Edited to add: I've also begun making use of the typeguard package, using development-time assertions for validation. Benefit: when run with "optimizations" enabled (-O / the PYTHONOPTIMIZE env var), the checks, which may be expensive (e.g. recursive), are omitted, the idea being that you've properly tested your app in development, so the checks should be unnecessary in production.
Upvotes: 1
Reputation: 160417
Despite all uses described here, the one enforceable and, most likely, enforced use of annotations will be for type hints.
This is currently not enforced in any way but, judging from PEP 484, future versions of Python will only allow types as the value for annotations.
Quoting What about existing uses of annotations?:
We do hope that type hints will eventually become the sole use for annotations, but this will require additional discussion and a deprecation period after the initial roll-out of the typing module with Python 3.5. The current PEP will have provisional status (see PEP 411 ) until Python 3.6 is released. The fastest conceivable scheme would introduce silent deprecation of non-type-hint annotations in 3.6, full deprecation in 3.7, and declare type hints as the only allowed use of annotations in Python 3.8.
Though I haven't seen any silent deprecations in 3.6 yet, this could very well be bumped to 3.7, instead.
So, even though there might be some other good use cases, it is best to keep annotations solely for type hinting if you don't want to go around changing everything in a future where this restriction is in place.
Upvotes: 1
Reputation: 24662
Just to add a specific example of a good use from my answer here: coupled with decorators, a simple mechanism for multimethods can be built.
# This is in the 'mm' module
import inspect

registry = {}

class MultiMethod(object):
    def __init__(self, name):
        self.name = name
        self.typemap = {}

    def __call__(self, *args):
        types = tuple(arg.__class__ for arg in args)  # a generator expression!
        function = self.typemap.get(types)
        if function is None:
            raise TypeError("no match")
        return function(*args)

    def register(self, types, function):
        if types in self.typemap:
            raise TypeError("duplicate registration")
        self.typemap[types] = function

def multimethod(function):
    name = function.__name__
    mm = registry.get(name)
    if mm is None:
        mm = registry[name] = MultiMethod(name)
    spec = inspect.getfullargspec(function)
    types = tuple(spec.annotations[x] for x in spec.args)
    mm.register(types, function)
    return mm
and an example of use:
from mm import multimethod

@multimethod
def foo(a: int):
    return "an int"

@multimethod
def foo(a: int, b: str):
    return "an int and a string"

if __name__ == '__main__':
    print("foo(1, 'a') = {}".format(foo(1, 'a')))
    print("foo(7) = {}".format(foo(7)))
This can be done by adding the types to the decorator as Guido's original post shows, but annotating the parameters themselves is better as it avoids the possibility of wrong matching of parameters and types.
Note: in Python 3 you access the annotations as function.__annotations__ rather than function.func_annotations, as the func_* style was removed in Python 3.
Upvotes: 25
Reputation: 6185
If you look at the list of benefits of Cython, a major one is the ability to tell the compiler which type a Python object is.
I can envision a future where Cython (or similar tools that compile some of your Python code) will use the annotation syntax to do their magic.
Upvotes: -2
Reputation: 184
Annotations can be used for easily modularizing code. E.g., a module for a program I'm maintaining could just define a method like:
def run(param1: int):
    """
    Does things.

    :param param1: Needed for counting.
    """
    pass
and we could ask the user for a thing named "param1" which is "Needed for counting" and should be an "int". In the end we can even convert the string given by the user to the desired type, for the most hassle-free experience. See our function metadata object for an open-source class which helps with this and can automatically retrieve needed values and convert them to any desired type (because the annotation is a conversion method). Even IDEs show autocompletion correctly and assume that the types match the annotations - a perfect fit.
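A rough sketch of that idea (not the actual class mentioned above; the helper name is invented): prompt the user for each parameter and use the annotation as the conversion callable.

import inspect

def ask_and_run(func):
    # Prompt for each parameter and convert the input via its annotation.
    sig = inspect.signature(func)
    kwargs = {}
    for name, param in sig.parameters.items():
        raw = input("Please enter {}: ".format(name))
        if param.annotation is not inspect.Parameter.empty:
            raw = param.annotation(raw)
        kwargs[name] = raw
    return func(**kwargs)

# ask_and_run(run) would prompt for "param1" and pass it to run() as an int.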
Upvotes: -2
Reputation: 1811
It has been a long time since this was asked, but the example snippet given in the question is (as stated there as well) from PEP 3107, and at the end of that PEP example use cases are also given, which might answer the question from the PEP's point of view ;)
The following is quoted from PEP 3107:
Use Cases
In the course of discussing annotations, a number of use-cases have been raised. Some of these are presented here, grouped by what kind of information they convey. Also included are examples of existing products and packages that could make use of annotations.
See the PEP for more information on specific points (as well as their references)
Upvotes: 3
Reputation: 1983
The first time I saw annotations, I thought "great! Finally I can opt in to some type checking!" Of course, I hadn't noticed that annotations are not actually enforced.
So I decided to write a simple function decorator to enforce them:
def ensure_annotations(f):
    # Check and EnsureError come from the Ensure library mentioned below
    # (assumed to be importable from the `ensure` package).
    from functools import wraps
    from inspect import getcallargs

    @wraps(f)
    def wrapper(*args, **kwargs):
        for arg, val in getcallargs(f, *args, **kwargs).items():
            if arg in f.__annotations__:
                templ = f.__annotations__[arg]
                msg = "Argument {arg} to {f} does not match annotation type {t}"
                Check(val).is_a(templ).or_raise(EnsureError, msg.format(arg=arg, f=f, t=templ))
        return_val = f(*args, **kwargs)
        if 'return' in f.__annotations__:
            templ = f.__annotations__['return']
            msg = "Return value of {f} does not match annotation type {t}"
            Check(return_val).is_a(templ).or_raise(EnsureError, msg.format(f=f, t=templ))
        return return_val
    return wrapper
@ensure_annotations
def f(x: int, y: float) -> float:
    return x + y

print(f(1, y=2.2))
>>> 3.2
print(f(1, y=2))
>>> ensure.EnsureError: Argument y to <function f at 0x109b7c710> does not match annotation type <class 'float'>
I added it to the Ensure library.
Upvotes: 14
Reputation: 226296
Function annotations are what you make of them.
They can be used for documentation:
def kinetic_energy(mass: 'in kilograms', velocity: 'in meters per second'):
    ...
They can be used for pre-condition checking:
def validate(func, locals):
    for var, test in func.__annotations__.items():
        value = locals[var]
        msg = 'Var: {0}\tValue: {1}\tTest: {2.__name__}'.format(var, value, test)
        assert test(value), msg

def is_int(x):
    return isinstance(x, int)

def between(lo, hi):
    def _between(x):
        return lo <= x <= hi
    return _between

def f(x: between(3, 10), y: is_int):
    validate(f, locals())
    print(x, y)
>>> f(0, 31.1)
Traceback (most recent call last):
...
AssertionError: Var: y Value: 31.1 Test: is_int
Also see http://www.python.org/dev/peps/pep-0362/ for a way to implement type checking.
Upvotes: 101
Reputation: 21089
Uri has already given a proper answer, so here's a less serious one: so you can make your docstrings shorter.
Upvotes: 24