Then you will not get this error (NameError: name 'Sequence' is not defined), because the module no longer needs a reference to `Sequence` for the annotation; the annotation is simply a string.
Using `if TYPE_CHECKING` is also useful when you want to speed up the module load time by not importing unnecessary modules (unnecessary at module load time).
It also helps resolve import cycles in cases where you still want to annotate the function argument types.
One problem though: if the type/object is not needed at module load time (where it is only used to annotate some function argument type), but is needed inside the function body, type checkers and IDEs (e.g. PyCharm) will not show any problem. At runtime, however, you will get the NameError once the function tries to use the type/object.
Example module:
```
from __future__ import annotations
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    import torch

def agi_model(query: torch.Tensor) -> torch.Tensor:
    torch.setup_secret_agi()
    ...
```
This module will load just fine, because `torch` is only needed for type checking at module load time. However, once you call this `agi_model` function, it will fail because `torch` is actually unknown at runtime. However, no IDE will show any problem with this code. (Edit: Eclipse/PyDev handles it correctly now, see answer below.)
Then you would add another `import torch` inside the `agi_model` function. However, the IDE might then complain that this shadows `torch` from the outer scope.
> However, no IDE will show any problem with this code.
That's not entirely true... I just checked, and Eclipse/PyDev does flag this correctly -- since version 11.0.0 ;)
See: https://www.pydev.org/history_pydev.html (Imports found inside a typing.TYPE_CHECKING will be considered undefined if the scope that uses it requires it to be available when not type-checking).
It kind of works, you just have to use `typing.get_type_hints(func)` instead of reading `func.__annotations__` directly. Although if you aren't importing the types used in your annotations, that isn't going to work... I would guess that's rare in practice.
Not really. `import type` just means that this import is guaranteed to be removed when compiling to JS. TS already fully erases types during compilation; this is simply a way to guarantee that the imports of types are erased as well.
E.g. for `import { SomeType } from "backend-server"`, you don't want to include your entire server code in the frontend just because you imported the response type of some JSON API. `import type` neatly solves this issue, and it even enforces that you can't import anything but types from your backend.
I think in practice that increases the need for TYPE_CHECKING. If you use deferred annotations, and you don't import the types you use, type checkers and IDEs will complain, and go to definition won't work. But if you import the types outside of a TYPE_CHECKING block it's wasteful, since you don't actually need the import for your code to run. So you need TYPE_CHECKING to satisfy your linter without impacting runtime performance.
> However, Python doesn’t have the compile-time check, because it’s an interpreted language that is dynamically-typed, which means its only real place to check is at runtime
Pet peeve: Python is compiled (into bytecode), so it could theoretically do checks at compile time. The "dynamically-typed" part is correct and is the real reason.
The problem for Python with type checking is that the language has essentially no compile-time constructs at all, just a bunch of assignment operators with different syntax: `def` assigns a function object to a variable when it's executed, `import` assigns a module, `class` assigns a class object. This means that everything involving names in Python is just a variable lookup (incidentally, this is why a function call must follow the declaration in Python: the variable is unbound if you haven't executed the function definition yet). The reason `Sequence` fails in the example in the article is simply that the code is trying to read an unassigned variable, no different from writing `vairable` instead of `variable` in your code.
TS is not interpreted. It compiles to JS, which is interpreted, but TS itself is not. There are no TS interpreters; they all compile to JS behind the scenes.
Or does Deno run TS natively? At a glance it looks like even its runtime is written in JS[0].
EDIT: looks like they have bindings for V8[1], so yeah, just a JS interpreter behind the scenes.
[0] https://github.com/denoland/deno/tree/main/runtime/js
[1] https://github.com/denoland/rusty_v8
Small but important correction required in point 3 of the TL;DR:
> Python doesn’t care about types at runtime
This should say "Python doesn't care about type annotations at runtime."
Python _does_ care about types at runtime - but it doesn't use type annotations to compute them.
That doesn't detract from what's otherwise a really clear and helpful post. Python is living through a schizophrenic period in its evolution, and it's causing some problems. I hope it gets ironed out, though don't underestimate the difficulty. Python remains my go-to for many problems: a joy to use (for me) in many cases. But I definitely feel the need for static typing as codebase size increases. Having type annotations is good, but not being able to rely on them (i.e. static typing is not sound) adds to cognitive load and detracts from a confident development experience.
I will add that, in my experience, type annotations that are unreliable and cumbersome, with so many edge cases, and that require this kind of magic (e.g. `if TYPE_CHECKING`) beyond most "casual" Python users, are effectively a very hard sell for most non-hardcore teammates.
I've been in a position to try improving processes and code quality of the Python codebase, and introducing mypy and type annotations has always been a very hard sell. The team just won't accept the benefits, given the difficulties in using it. If only Python made it easier and more reliable, but the current state is a mess.
I come from the static typing world, so of course I see the benefits. But most/all of my team mates don't, and so they see this as some really odd hoops I want them to jump through, for minimal benefit (in their eyes).
I think type annotations are extremely valuable even if you don't use mypy. On my team we basically use them as comments that are understood by the IDE, but aren't enforced. Of course comments can lie, but appeasing mypy really is a lot of work, and I would generally rather have a slightly dishonest but mostly correct type hint than some TypeVar monstrosity or, worse, Any.
It's arguable whether a SyntaxError happens as a compile-time check. It's true that it is raised before runtime. But a document that can be parsed according to the syntax definition is a necessary precondition to all checks of any static program property. Syntactic correctness in itself is not a static property of a program. A program is, by definition, syntactically correct.
Without a correct syntax there is no way to assign meaning or execute or interpret, etc.
One question comes to mind: aren't cyclic dependencies something to be resolved at the root cause (by removing the cycle)? They cause various pains down the line, like the type-checking issues in this case.
The issue is that, often, without type annotations there would be no cyclic dependency (because any type matching the implicit interface would work). So by introducing type annotations (which means extra imports), you end up having to refactor quite a lot of code and the end game is a one-type-per-module-with-separate-interface-definition system of the sort encountered in, well, statically-typed languages. If you don't do this you end up with the 'Type in string' annotations mentioned in the article and TYPE_CHECKING-gated imports. Both of which feel kind of hackish.
It's fine to have implicit circular dependencies in Python because duck typing means that every function is effectively generic. Type annotations eliminate this and turn Python into a fundamentally different language.
Type annotations aren't just extra boilerplate in function signatures, they also have these sorts of knock-on effects.
Broadly speaking, yes. But in some situations, like database models with bi-directional relationships, removing the cycle can create unnecessary maintenance burden purely for the benefit of type checking. `TYPE_CHECKING` lets you work around that, since the cyclical nature is caused only by the typing, not by the runtime semantics.
Removing cycles may require extensive refactoring of the code, sometimes resulting in unintuitive layouts, all just to solve the cyclic-import errors.
At the core of the problem is the fact that there's no such thing as a "partial import" in Python. When you do `from M import A`, all the contents of M are evaluated and a reference to A is added to the current namespace. So cyclic dependencies arise sooner or later, unless you adopt some very un-Pythonic style for your codebase (e.g. one file per class).
In Python it's very hard to avoid cyclic dependencies. Something as simple as a parent-child link between two classes is a cyclic dependency, and if the classes are in different files you have a cycle like in the blog post. This bites especially hard when typing your codebase, because you have to name every type, and in Python the implementation is the interface, i.e. you can't just include a separate type-definition file.
Could you expand on that? It is truly surprising to me that anyone would find code with type annotations to be significantly worse, for any reason whatsoever.
On the contrary, I joyfully read and write code with type annotations. It is obviously very useful knowing which object types a function expects and which it will return.
If you don't do runtime typechecking in the entire program, sure.
[1] Except some light scripting
```
import type { foo } from ...
```
I also want to state that TS already removes regular `import`s that only import types. `import type` is mostly used to help bundlers. https://www.typescriptlang.org/docs/handbook/release-notes/t...
https://peps.python.org/pep-0649/
Yes it does. All Python source code is parsed and compiled into bytecode. SyntaxError is raised before runtime.
What do you mean? Where is the cycle?
You're in Python. Either embrace it or use another language.