This wasn't mentioned, but the constant validation on construction also costs something. Sometimes it's a cost you're willing to pay (again, dealing with external inputs), sometimes it's extraneous because e.g. a typechecker would suffice to catch discrepancies at build time.
But you are absolutely right. To add a little: In practice, if a third-party library hardly ever changes and makes life dramatically easier, you can consciously decide to accept the coupling in your domain, but that should be the exception, not the rule.
Pydantic is great at turning large, nested dictionaries into validated objects, yet none of that power solves a domain problem. Inside the domain you only need plain data and behaviour: pure dataclasses already give you that without extra baggage. And that's the main reason to leave it out.
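To make that concrete, here's a minimal sketch of what "plain data and behaviour" looks like; the `Book` type and the page threshold are just illustrative assumptions, not something from the post:

```python
from dataclasses import dataclass


# Hypothetical domain model: plain data plus behaviour,
# no third-party imports, no validation framework.
@dataclass
class Book:
    title: str
    pages: int

    def is_long_read(self) -> bool:
        # A pure business rule; the 500-page threshold is illustrative.
        return self.pages > 500
```

That's the whole mental model a new team member needs: a dataclass and a method.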
The less your domain knows about the outside world, the less often you have to touch it when the outside world moves. And the easier it becomes for any new team member to pick up that logic: no extra mental model, no hidden framework magic, just the business concepts in plain Python. And exactly what you mentioned: if you ever want to drop Pydantic, you don't need to touch the domain. The less you have to touch, the easier it is to replace.
So the guideline is simple: dependencies point inward. Keep the domain free of third-party imports, and let Pydantic stay where it belongs, in the outside layers.
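A sketch of that boundary, assuming a hypothetical `book_from_payload` adapter: in practice the validating function below is exactly where a Pydantic model would live, but it's written with only the stdlib here to keep the example self-contained.

```python
from dataclasses import dataclass


# Domain layer: a frozen dataclass, no third-party imports.
@dataclass(frozen=True)
class Book:
    title: str
    pages: int


# Outer layer: validate the raw payload at the boundary, then hand
# a plain dataclass inward. Swap this function for a Pydantic model
# without the domain ever noticing.
def book_from_payload(payload: dict) -> Book:
    title = str(payload["title"])
    pages = int(payload["pages"])
    if pages < 0:
        raise ValueError("pages must be non-negative")
    return Book(title=title, pages=pages)
```

The dependency arrow points inward: the adapter knows about `Book`, but `Book` knows nothing about the adapter.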
He doesn't even say why you should tediously duplicate everything instead of just using the Pydantic objects - just "You know you don't want that"! No, I don't.
The only reason I've heard is performance... but... you're using Python. You don't give a shit about performance.
In fact, a reader who emailed me ran into a case where, if an aggregate holds just one entity type, for example Bookcase -> list[Book], and that list grows significantly, loading the whole aggregate can lead to performance issues. In such cases you might even need to consider a different design to get around that. But that's a separate topic.
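For the Bookcase example, the naive aggregate and one common workaround might look like this; the "slim" variant holding only IDs is my assumption about a fix, not something from the reader's email:

```python
from dataclasses import dataclass, field
from uuid import UUID, uuid4


@dataclass
class Book:
    id: UUID
    title: str


# Naive aggregate: Bookcase embeds every Book, so loading the
# aggregate means materialising the entire (possibly huge) list.
@dataclass
class Bookcase:
    name: str
    books: list[Book] = field(default_factory=list)


# One common workaround (illustrative): hold only the book IDs and
# let a repository fetch Book entities on demand.
@dataclass
class SlimBookcase:
    name: str
    book_ids: list[UUID] = field(default_factory=list)
```

Splitting the aggregate like this trades a single consistency boundary for cheaper loads, which is why it really is a separate design discussion.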
What I was trying to highlight earlier were the whys behind the approach. And based on the feedback here, it might be a good idea to update the post. I really appreciate all your input.
As for the whys: the less your domain knows about the outside world, the less often you need to change it when the outside world changes, and the easier it becomes for new team members to understand the logic. It also separates your database models from your domain models, which is great IMHO: it lets you change them independently of each other. You could keep separate domain models, database models, and API models and use Pydantic for all of those layers, but why would you? If you need to do the translation anyway, why not translate to pure dataclasses: no extra mental model, no hidden framework magic, just business concepts in plain Python.

This does depend on your specific situation, though; there are valid reasons not to do this. But once your application grows and is no longer just a simple CRUD application, I wonder whether there are enough reasons to NOT keep Pydantic in the outside layers. So yes, for small, simple applications it might be overcomplicated to introduce this overhead when your data stays relatively consistent across layers.