If a table sees many inserts and deletes, it accumulates dead row versions ("tombstones"), and Postgres will eventually be forced to vacuum it. Vacuuming doesn't block normal operation, but an autovacuum on a large table can be resource intensive, especially on the storage/IO side. At worst this turns into resource contention: either the autovacuum effectively never finishes (because it can't keep up with the write rate), or every query on the system takes a severe performance hit. And since this is your postgres-as-redis, there is a good chance all of your hot paths rely on the cache and get slowed down significantly.
Both of these result in different kinds of fun: either your applications just stop working because Postgres is busy cleaning up, or you end up with horrible table bloat down the line, which will take hours and hours of application downtime to fix, because your drives are fast, but not that fast.
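If you want to see how close you are to that cliff, Postgres exposes dead-tuple counts per table in the `pg_stat_user_tables` view (the table name `cache` below is just an example):

```sql
-- Dead-tuple buildup and when autovacuum last ran, worst offenders first
SELECT relname, n_live_tup, n_dead_tup, last_autovacuum
FROM pg_stat_user_tables
ORDER BY n_dead_tup DESC;
```

A steadily climbing `n_dead_tup` with a stale `last_autovacuum` is the early warning sign for the scenario above.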
There are ways to work around this, naturally. You could add an expiration column with an index on it and expire entries with "delete from cache where expiration_key < now()", or throw pg_partman at it to partition the table on the expiration key and drop old values by dropping whole partitions... but at some point you start wondering whether a system actually built for this kind of workload would be easier.
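The partitioning variant might look something like this, a sketch assuming a hypothetical `cache` table and pg_partman's v4-style `create_parent` call (the column names and the daily interval are illustrative, not from the post):

```sql
-- Hypothetical cache table, range-partitioned on expiry so that expired
-- entries are removed by dropping whole partitions instead of DELETEs
CREATE TABLE cache (
    key        text NOT NULL,
    value      jsonb,
    expires_at timestamptz NOT NULL
) PARTITION BY RANGE (expires_at);

-- Let pg_partman create and retire daily partitions on expires_at
SELECT partman.create_parent(
    p_parent_table := 'public.cache',
    p_control      := 'expires_at',
    p_type         := 'native',
    p_interval     := 'daily'
);
```

Dropping a partition is a near-instant metadata operation and generates no dead tuples, which is exactly why this dodges the vacuum problem above.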
>> python3.10 create_task_overhead.py
100,000 tasks 185,694 tasks per/s
200,000 tasks 165,581 tasks per/s
300,000 tasks 170,857 tasks per/s
400,000 tasks 159,081 tasks per/s
500,000 tasks 162,640 tasks per/s
600,000 tasks 158,779 tasks per/s
700,000 tasks 161,779 tasks per/s
800,000 tasks 179,965 tasks per/s
900,000 tasks 160,913 tasks per/s
1,000,000 tasks 162,767 tasks per/s
>> python3.11 create_task_overhead.py
100,000 tasks 289,318 tasks per/s
200,000 tasks 265,293 tasks per/s
300,000 tasks 266,011 tasks per/s
400,000 tasks 259,821 tasks per/s
500,000 tasks 251,819 tasks per/s
600,000 tasks 267,441 tasks per/s
700,000 tasks 251,789 tasks per/s
800,000 tasks 254,303 tasks per/s
900,000 tasks 249,894 tasks per/s
1,000,000 tasks 266,581 tasks per/s
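The post doesn't include `create_task_overhead.py` itself; a minimal sketch of what such a benchmark might look like is below. The loop bounds, coroutine body, and output format are guesses matched to the output above, and it only times `asyncio.create_task` itself, not running the tasks:

```python
import asyncio
import time


async def noop() -> None:
    pass


async def bench(n: int) -> float:
    """Create n tasks and return the creation rate in tasks per second."""
    start = time.perf_counter()
    tasks = [asyncio.create_task(noop()) for _ in range(n)]
    rate = n / (time.perf_counter() - start)
    await asyncio.gather(*tasks)  # let the tasks finish cleanly
    return rate


if __name__ == "__main__":
    # The original run presumably stepped from 100,000 up to 1,000,000;
    # smaller counts here to keep the sketch quick to run.
    for n in (10_000, 100_000):
        print(f"{n:,} tasks {asyncio.run(bench(n)):,.0f} tasks per/s")
```

The jump from roughly 160k to 260k tasks per second between 3.10 and 3.11 is consistent with the general interpreter speedups that landed in CPython 3.11.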