Reputation: 1764
Is there any advantage to using the 'type hint' notation in python?
import sys

def parse(arg_line: int) -> str:
    print(arg_line)  # passing a string, returning None

if __name__ == '__main__':
    parse(' '.join(sys.argv[1:]))
To me it seems like it complicates the syntax without providing any actual benefit (outside of perhaps within a development environment). Are there any plans for Python to contain type constraints within the language itself? I also don't see this much in the Python codebase itself as far as I can tell -- most types are enforced manually, for example in argparse.py and any other files I've glanced at in https://github.com/python/cpython/blob/3.7/Lib/.
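For reference, a quick sketch of the point in question: the interpreter itself ignores the hints entirely, so the snippet above runs without complaint even though both the argument and the return value contradict the annotations (only a separate static checker such as mypy would flag the mismatch).

```python
def parse(arg_line: int) -> str:
    print(arg_line)  # annotated int -> str, but nothing enforces it

# A str argument and an implicit None return both pass silently at runtime.
result = parse('hello world')
print(result)  # None
```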
Upvotes: 27
Views: 28684
Reputation: 365
Maybe beartype is what you want. See the beartype documentation and the beartype GitHub repo.
Beartype is an open-source pure-Python PEP-compliant near-real-time hybrid runtime-static third-generation type-checker emphasizing efficiency, usability, unsubstantiated jargon we just made up, and thrilling puns.
Beartype enforces type hints across your entire app in two lines of runtime code with no runtime overhead. If seeing is believing, prepare to do both those things.
# Install beartype.
$ pip3 install beartype
# At the very top of your "{your_package}.__init__" submodule:
from beartype.claw import beartype_this_package # <-- boilerplate for victory
beartype_this_package() # <-- yay! your team just won
Beartype now implicitly type-checks all annotated classes, callables, and variable assignments across all submodules of your package. Congrats. This day all bugs die.
Upvotes: 2
Reputation: 106523
"Possible to enforce type hints?" The short answer is NO. It is not possible.
The long answer is that it is technically impossible for Python to enforce type hints in a generic fashion because instances of certain types would change state just by validating them.
Consider the following function that joins an iterable of strings into one string:
from typing import Iterable

def join(iterable: Iterable[str]) -> str:
    return ''.join(iterable)
Now, at runtime, how can a supposed type enforcer validate the iterable argument as a proper Iterable[str]? "It's easy," one may say, "just iterate over the iterable and validate that the items generated are all strings!"
So we can write an Iterable[str] enforcer as a function decorator:
def enforce_iterable_str(func):
    def wrapper(arg):
        try:
            for i in arg:
                if not isinstance(i, str):
                    raise TypeError
        except TypeError as e:
            raise TypeError('Argument must be an iterable of strings.') from e
        return func(arg)
    return wrapper

@enforce_iterable_str
def join(iterable: Iterable[str]) -> str:
    return ''.join(iterable)

print(join(['a', 'b', 'c']))  # outputs abc
print(join(['a', 'b', 'c'])) # outputs abc
Looks good, until you pass to it an iterable that changes state when iterated over:
print(join(iter(['a', 'b', 'c']))) # outputs nothing
The iterator is consumed by the type enforcer so it has nothing left for the actual body of the function to process.
The same issue applies to typing.IO[bytes], where it is unknown at runtime what type of data reading from an IO stream would return until the type enforcer actually reads from the stream, at which point there is less content left in the stream for the function body to read.
This is the hard reason why it is technically impossible to implement a general runtime type enforcer.
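This is also why practical runtime checkers settle for shallow checks that cannot consume the argument. A minimal hand-rolled sketch of that compromise (not any particular library's implementation): validate the container type with isinstance, which never iterates, and leave the items unchecked.

```python
from collections.abc import Iterable

def shallow_check(arg):
    # isinstance() inspects only the type of the container, so an
    # iterator or stream is left untouched for the function body.
    # The trade-off: the items themselves are never validated.
    if not isinstance(arg, Iterable):
        raise TypeError('Argument must be iterable.')
    return arg

def join(iterable):
    return ''.join(shallow_check(iterable))

print(join(iter(['a', 'b', 'c'])))  # abc -- the iterator survives the check
```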
Upvotes: 2
Reputation: 1267
Short answer is that there is no plan for standard support, as stated in the other answers. One alternative which I do not see mentioned here is pydantic.
The BaseModel from pydantic is essentially a Python data class that performs type validation in its constructor (and also has handy methods to validate/unmarshal from JSON buffers). One feature is definitely noteworthy for your specific question: if you pass a string that can be parsed to the correct type (i.e. '1' -> int or '1.5' -> float), pydantic will only throw an error if the value cannot be coerced into the correct type, and otherwise performs the conversion for you (this can be subverted by using strict types, or by setting strict=True and using BaseModel.model_validate).
It's not like type_enforced, which just uses your annotations/hints (by wrapping your function in another function that checks them); you'll still end up using the BaseModel constructor/validators to enforce your types, i.e.:
from json import dumps as to_json
from typing import Union

from pydantic import BaseModel, ValidationError

class Foo(BaseModel):
    a: int
    b: Union[int, str] = 2
    c: int = 3

def my_fn(foo: Foo) -> None:
    pass

good: Foo = Foo(a=1, b=2, c=3)
also_good: Foo = Foo(a='1', b=2, c='3')  # strings coerced to int

good_dict: dict = {"a": 1, "b": 2, "c": 3}
bad_dict_strict: dict = {"a": '1', "b": 2, "c": '3'}
good_dict_json: str = to_json(good_dict)

from_good_dict: Foo = Foo(**good_dict)
from_good_dict_json: Foo = Foo.model_validate_json(good_dict_json)

try:
    bad: Foo = Foo(a='b', b='a', c=1)  # 'b' cannot be coerced to int
except ValidationError as ve:
    print(ve)

try:
    bad_strict: Foo = Foo.model_validate(bad_dict_strict, strict=True)
except ValidationError as ve:
    print(ve)

my_fn(good)
my_fn(also_good)
my_fn(from_good_dict)
my_fn(from_good_dict_json)
Upvotes: 3
Reputation: 1470
One option to take advantage of type hints is the type_enforced module. It runs in pure Python on the fly, with no external compiler needed. Regarding official Python support, it still seems unlikely that type hints will be enforced directly in the near future.
Going into type_enforced, the package allows you to take advantage of type hints. It supports both input and output typing. Only types that are specified are enforced. Multiple possible inputs are also supported, so you can specify something like int or float.
Input types are validated first (lazily, on the function call) and, if they are valid, the function runs and the return value is then validated.
There are some limitations, such that nested type structures are not supported. For example, you can not specify a type as a list of integers, but only as a list; you would need to validate the items in the list inside of your function. Update: in v1.0.0, additional support for the typing package, nested type structures and even Optionals was added.
pip install type_enforced
>>> import type_enforced
>>> @type_enforced.Enforcer
... def my_fn(a: int, b: [int, str] = 2, c: int = 3) -> None:
... pass
...
>>> my_fn(a=1, b=2, c=3)
>>> my_fn(a=1, b='2', c=3)
>>> my_fn(a='a', b=2, c=3)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/conmak/development/personal/type_enforced/type_enforced/enforcer.py", line 47, in __call__
return self.__validate_types__(*args, **kwargs)
File "/home/conmak/development/personal/type_enforced/type_enforced/enforcer.py", line 83, in __validate_types__
self.__check_type__(assigned_vars.get(key), value, key)
File "/home/conmak/development/personal/type_enforced/type_enforced/enforcer.py", line 56, in __check_type__
self.__exception__(
File "/home/conmak/development/personal/type_enforced/type_enforced/enforcer.py", line 37, in __exception__
raise TypeError(f"({self.__fn__.__qualname__}): {message}")
TypeError: (my_fn): Type mismatch for typed function (my_fn) with `a`. Expected one of the following `[<class 'int'>]` but got `<class 'str'>` instead.
Upvotes: 12
Reputation: 7111
Are there any plans for python to contain type constraints within the language itself?
Almost certainly not, and definitely not before the next major version (4.x).
What is the advantage of having a "type hint" ? Couldn't I just as easily throw that into the docstring or something?
Off the top of my head, consider the following:

- Static analysis: tools such as mypy can use the hints to catch type errors before the code ever runs.
- Autocompletion: when you type foo(, the IDE can pick up on the type hints and display a box nearby that shows foo(x: int, y: List[int]). The advantage to you as a developer is that you have exactly the information you need exposed to you and don't have to munge an entire docstring.
- Dispatching: the hints let tools like functools.singledispatch or external libraries like multipledispatch add additional type-related features (in this case, dispatching function calls based on name and type, not just name).

Upvotes: 21