I strongly think Python should have more functional programming support. Functional programming languages usually look scary to many programmers. Python would not only be a good FP introduction for them; it would also benefit greatly itself.
Years ago I found out that Guido wouldn't let tail-call elimination be included, and even tried to remove the map function from the built-ins. That gave me the impression that Python wouldn't get further FP support. I really hope that is not the case; with pattern matching coming in 3.10, my hopes are high again.
I have very high respect for Guido van Rossum; I'm not here to discredit him. He's one of the authors of PEP 634.
I wish Python had a simpler syntax for lambdas than the current lambda expression; even JS does better here. A built-in syntax for partial application would also be great, and it would be even better if we had function composition.
Some problems are better solved the FP way. It could even make the program more readable, which is one of the strengths of Python.
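For context, here is roughly what those wishes look like with today's tools: functools.partial covers partial application, while composition has to be hand-rolled (the compose helper below is just an illustrative sketch, not a stdlib function).

    from functools import partial

    def add(a, b):
        return a + b

    add_one = partial(add, 1)            # partial application works, but needs an import
    also_add_one = lambda b: add(1, b)   # the lambda spelling is wordier than JS's b => add(1, b)

    # No built-in composition; a tiny hand-rolled helper for illustration:
    def compose(f, g):
        return lambda x: f(g(x))

    double_then_add_one = compose(add_one, lambda x: x * 2)
    print(add_one(2), double_then_add_one(3))  # 3 7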
Yes, coming back to Python after many years with Kotlin and Elm, I see how the Python language nudges people toward writing less readable code.
Mutations everywhere. But mainly because mapping and flattening stuff is so burdensome. And because it lacks many FP functions and a proper way to call them ergonomically, a crazy list comprehension becomes the go-to tool, which is often much less explicit about what's going on than calling a function with a named and widely understood concept.
> And because it lacks many FP functions and a proper way to call them ergonomically, a crazy list comprehension becomes the go-to tool, which is often much less explicit about what's going on than calling a function with a named and widely understood concept.
I've said roughly this before somewhere on HN but cannot find it right now: list comprehensions are like regexes. Below a certain point in complexity, they're a much cleaner and immediately-graspable way of expressing what's going on in a given piece of code. Past that point, they rapidly become far worse than the alternative.
For example, "{func(y): y for x, y in somedict.items() if filterfunc(x)}" is clearer than the equivalent loop-with-tempvars, and significantly clearer than "dict(map(lambda k: (func(somedict[k]), somedict[k]), filter(filterfunc, somedict.keys())))". Even with some better mapping utilities (something like "mapkeys" or "filtervalues", or a mapping utility for dictionaries that inferred keys-vs-values based on the arity of map functions), I think the "more functional" version remains the least easily intelligible.
However, once you cross into nested or simultaneous comprehensions, you really need to stop, drop, and bust out some loops, named functions, and maybe a comment or two. So too with regex! Think about the regex "^prefix[.](\S+) [1-7]suffix$"; writing the equivalent stack of splits and conditionals would be more confusing and easier to screw up than using the regex; below a point of complexity it's a much clearer and better tool to slice up strings. Past a point regexes, too, break down (for me that point is roughly "the regex doesn't fit on a line" or "it has lookaround expressions", but others may have different standards here).
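To make the comparison concrete, here is a small runnable version of that example; func and filterfunc are placeholder choices, since the original leaves them abstract.

    somedict = {"a": 1, "b": 2, "c": 3}
    filterfunc = lambda k: k != "b"   # placeholder predicate on keys
    func = str                        # placeholder transform on values

    # Comprehension version
    by_comprehension = {func(y): y for x, y in somedict.items() if filterfunc(x)}

    # "More functional" version of the same thing
    by_map_filter = dict(map(lambda k: (func(somedict[k]), somedict[k]),
                             filter(filterfunc, somedict.keys())))

    assert by_comprehension == by_map_filter == {"1": 1, "3": 3}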
Please suggest a pattern matching expression syntax that doesn’t require completely changing how the language works.
Pattern matching took so long because it was extremely hard to find that compromise, and I actually think they did an okay job. No, it won’t cover every case, but it will also cover the important ones.
Learn You a Haskell for Great Good taught me list comprehension, and I was surprised to see such a pithy shorthand available in Python (+ dict comprehension). That's a very functional way to transform objects from one shape to another with an in-line expression, no lambda keyword there.
One gripe that I have with functions like map is that it returns a generator, so you have to be careful when reusing results. I fell into this trap a few times.
I'd also like a simpler syntax for closures, it would make writing embedded DSLs less cumbersome.
> One gripe that I have with functions like map is that it returns a generator, so you have to be careful when reusing results
I hope that is never changed; I often write code in which the map function's results are very large, and keeping those around by default would cause serious memory consumption even when I am mapping over a sequence (rather than another generator).
Instead, I'd advocate the opposite extreme: Python makes it a little too easy to make sequences/vectors (with comprehensions); I wish generators were the default in more cases, and that it was harder to accidentally reify things into a list/tuple/whatever.
I think that if the only comprehension syntax available was the one that creates a generator--"(_ for _ in _)"--and you always had to explicitly reify it by calling list(genexp) or tuple(genexp) if you wanted a sequence, then the conventions in the Python ecosystem would be much more laziness-oriented and more predictable memory-consumption-wise.
Ah well, water under the bridge, I know.
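A quick sketch of the trade-off being discussed: list comprehensions reify eagerly, while generator expressions stay lazy and can only be consumed once.

    nums = range(1_000_000)

    squares_list = [n * n for n in nums]   # eagerly builds the whole list in memory
    squares_gen = (n * n for n in nums)    # lazy; items are produced on demand

    total = sum(squares_gen)               # consumes the generator
    assert list(squares_gen) == []         # now exhausted: this is the reuse trap mentioned above

    first = list(n * n for n in nums)[:3]  # reify explicitly (list/tuple) when a sequence is needed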
I think that's the most powerful part of the tool. Being able to effortlessly write lazy generators is absolutely amazing when working with any sort of I/O system or async code.
> It could be even better if we can have function composition.
I think composition and piping are such basic programming tools that make a lot of code much cleaner. It's a shame they're not built into Python.
So, shameless plug: in the spirit of functools and itertools I made the pipetools library [0]. On top of forward composition and piping it also enables more concise lambda expressions (e.g. X + 1) and more powerful partial application.
[0] https://0101.github.io/pipetools/doc/
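pipetools itself is linked at [0]; as a library-agnostic illustration of why forward piping reads nicely, here is a minimal hand-rolled pipe helper (a sketch, not the pipetools API).

    from functools import reduce

    def pipe(value, *funcs):
        # Thread a value through a series of functions, left to right.
        return reduce(lambda acc, f: f(acc), funcs, value)

    result = pipe(
        range(10),
        lambda xs: (x * x for x in xs),
        lambda xs: [x for x in xs if x % 2 == 0],
        sum,
    )
    print(result)  # 0 + 4 + 16 + 36 + 64 = 120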
First you claim functional languages are scary to look at, then say that you want Python to become more like them. But maybe the reason Python is elegant and easy to read is exactly that Guido had the self-restraint not to go full functional. You also miss the part of the story where he actually did remove `reduce` from the builtins, exactly because of how unreadable and confusing it is to most people.
It's exactly that kind of decision-making I expected from a BDFL, and I think Guido did a great job, while he was one, of keeping Python from going down such paths.
Well, the BDFL is probably the only dictator we all love.
Any decision he made carries infinitely more weight than any I could make, because I am just a Python user and an outsider to the decision-making process. So for me, he's right all the time. That's a perfect definition of a dictator :)
But I do have wishes. It's like I love my parents but I do want to stay up late sometimes.
Yeah, I totally missed the part where he removed reduce from the builtins; sorry, my memory failed me. map, filter, or reduce, it does not matter. As I stated, some problems are better solved the functional way. Because Python is such a friendly language, if it included the functional paradigm properly, its functional parts would be more readable than in other functional languages.
FP is scary not because it has evil syntax to keep people at a distance; it's just an alien paradigm to many. Lots of non-functional languages have functional support, which doesn't make them less readable, e.g. C# and JS. I suspect these languages have helped many people understand FP better. Python could make the jump by including more FP without turning into a full-fledged FP language.
BTW. I'm still glad reduce is kept in functools.
The worst part of Python is the lack of utility functions for munging collections. But it sits at a slightly higher level than this - things that are idiomatic in other scripting languages, like groupBy(fn), sortBy(fn), countBy(fn), collate, are all inexplicably like IKEA self-assembled furniture in Python instead of first-class entities. It makes lots of routine data munging and algorithmic work much more annoying than it needs to be (compared to Groovy, Ruby, etc.).
The real issue is that Python doesn't really make it easy to define anonymous functions. If you want one with more than one line, you need to name it and move its definition outside the point of use. Quite annoying.
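A small illustration of that limitation: a lambda can only hold a single expression, so anything involving statements has to become a named def, usually defined away from where it is used.

    # Fine: a single expression
    key = lambda item: item[1]

    # Not possible as a lambda: statements force a named function,
    # typically defined away from the call site.
    def key_with_logging(item):
        print("comparing", item)
        return item[1]

    print(sorted([("a", 2), ("b", 1)], key=key_with_logging))  # [('b', 1), ('a', 2)]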
itertools.groupby isn't really the groupBy operation that people would normally expect. It looks like it would do a SQL-style "group by" where it categorizes elements across the collection, but really it only groups adjacent elements, so you end up with the same group key multiple times, which can cause subtle bugs. From my experience, it's more common for it to be misused than used correctly, so at my work we have a lint rule disallowing it. IMO this surprising behavior is one of the unfriendliest parts of Python when it comes to collection manipulation.
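A short sketch of the pitfall described above: itertools.groupby only groups adjacent elements, so unsorted input yields the same key more than once unless you sort first (or accumulate into a dict yourself).

    from itertools import groupby
    from collections import defaultdict

    words = ["apple", "avocado", "banana", "apricot"]
    first_letter = lambda w: w[0]

    # Surprise: 'a' shows up twice, because only adjacent runs are grouped.
    print([(k, list(g)) for k, g in groupby(words, first_letter)])
    # [('a', ['apple', 'avocado']), ('b', ['banana']), ('a', ['apricot'])]

    # SQL-style grouping needs sorting first...
    print([(k, list(g)) for k, g in groupby(sorted(words, key=first_letter), first_letter)])

    # ...or a plain dict accumulation.
    groups = defaultdict(list)
    for w in words:
        groups[first_letter(w)].append(w)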
Annoyingly, itertools is one of those packages in the Python standard library with wordsruntogether-named functions, so it's really known as itertools.groupby instead.
It's a package, not a built-in, but pytoolz [0] is a very complete solution to the type of functional-programming munging you are talking about. I wish more people knew about it (no one here seems to have mentioned it). And it has been Cython-optimized in another package called cytoolz [1]. The author explains in the Heritage section how it is an extension of, and plays nicely with, itertools and functools [2].
For Python coders new to functional programming, and how it can make working with data easier, I highly recommend reading the following sections of the pytoolz docs: Composability, Function Purity, Laziness, and Control Flow [0].
[0] https://toolz.readthedocs.io/en/latest/
[1] https://github.com/pytoolz/cytoolz
[2] https://toolz.readthedocs.io/en/latest/heritage.html
Yes - the description of writing Python as being like assembling IKEA furniture is absolutely spot-on. Yes, you can do it, and the results can be nice, but by God it is sometimes such a pain.
The comment showing where groupBy, sortBy etc. can be found just shows the problem - they are all in different libraries. That's just plain annoying! And don't get me started on the pain of trying to build an Ordered Dictionary with a default initial value!
What leaps out at me is that these 3 functions are all straight out of a relational-algebra-style worldview.
Python the language doesn't support relational algebra as a first-class concept. The reason it feels like IKEA self-assembly is probably because you are implicitly implementing a data model that isn't how Python thinks about collections.
groupBy(fn): {fn(x): x for x in foo}
sortBy(fn): sorted(foo, key=fn)
countBy(fn): Counter(fn(x) for x in foo)
flatten: [x for y in foo for x in y]
filter(fn): [x for x in foo if fn(x)]
All the "for x in blah" can seem like a lot of boilerplate for a non-Pythonista, but it becomes subconscious once you're used to it and actually helps you feel the structure (a bit like indentation isn't necessary in C but it still helps you to see it).
For compound operations (e.g. merge two lists, filter and flatten), I find the code a lot easier to "feel" than if you'd combined several functional-style functions, where you have to read the function names instead of just seeing the structure.
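For reference, runnable versions of those one-liners; note that the groupBy dict comprehension as written keeps only the last element per key, so a real grouping needs something like a defaultdict (added below as an illustration).

    from collections import Counter, defaultdict

    foo = [1, 2, 3, 4, 5]
    fn = lambda x: x % 2

    last_per_key = {fn(x): x for x in foo}   # keeps only the last element per key: {1: 5, 0: 4}
    grouped = defaultdict(list)              # an actual groupBy
    for x in foo:
        grouped[fn(x)].append(x)             # {1: [1, 3, 5], 0: [2, 4]}

    sorted_by = sorted(foo, key=fn)                     # sortBy
    counted = Counter(fn(x) for x in foo)               # countBy: Counter({1: 3, 0: 2})
    flattened = [x for y in [[1, 2], [3]] for x in y]   # flatten: [1, 2, 3]
    filtered = [x for x in foo if fn(x)]                # filter: [1, 3, 5]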
One of the harder things for beginners to grasp when learning a second or third programming language is that language is not just syntax. You see it all the time in Python in particular: people writing code like it's C or Java. It's no different to grabbing a French dictionary and transliterating your English sentence to French word by word. It just doesn't work and, in some cases, is completely wrong. Grokking a language means so much more than knowing the key words for doing all the stuff you used to do in your old language.
Most popular languages offer similar operations on collections these days. I've used Java, Scala, JS/TS, PureScript, Ruby, C++, and Rust in the past, and these days they all offer similar idiomatic functions for iterating over collections, such as map/filter/sort/etc.
I'm working on some Python these days and I find the one-line-only lambdas quite unpleasant, along with the generally messy options for implementing common collection operations.
10 years is very optimistic: flatten has little use without a functional pipeline (something like Clojure's threading macro), as in other contexts it's a trivial comprehension.
I think the motivation there historically was to encourage use of generators and generator expressions — it’s definitely intentional, because things like `apply` and `reduce` used to be builtins but now they’re imports.
functools and itertools are amazing and I love them both. They are especially useful for teaching high school CS without having to stray from Python, which the kids at all levels know well and are comfortable with.
However.
Using @cached_property feels like a bad code smell and it’s a controversial design decision. The example given is more like an instance of a factory? Perhaps if the result of the render function was an Important Object™ (not just a mere string) then the function’s callers might not be so cavalier about discarding the generated instance.
I would like to see the calling code that calls `render` more than once in two different places (hence the need for caching with no option for cache invalidation?). When every stack frame is a descendant of main() then there’s no such thing as “two different places”!
    def main():
        page = HN().render()
        while 'too true':
            do_work(page)
            tea_break(page)
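For readers who haven't used it, a minimal sketch of what functools.cached_property does; the HN class here is hypothetical, mirroring the example above. The value is computed once per instance on first access, stored on the instance, and deleting the attribute is the only built-in way to invalidate it.

    from functools import cached_property

    class HN:
        @cached_property
        def page(self):
            print("rendering...")        # runs only on first access
            return "<html>...</html>"

    hn = HN()
    hn.page      # prints "rendering..." and caches the string on the instance
    hn.page      # served from hn.__dict__, no recomputation
    del hn.page  # drop the cached value; the next access recomputes it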
Because of the GIL the cache is often pretty useless on a webserver anyway, as each and every request gets a different process that doesn't share that cache. So then one has to deploy Redis in addition and complicate the stack. A shared cache makes sense when there are multiple instances, but often it's overkill. If I have to go "external" to fetch the data, I might as well not cache anything.
It’s just a smell. It hints that instead of passing the instance, one should be passing the output of the function.
It also doesn’t really matter. The only time things like this have actually been important was in giant Java codebases, where attempts were made to keep packages cleanly separated (and so therefore passing the simplest possible public types reduces the amount of public classes).
That's one of the things I love about JS. Sure, you don't have all the good static typing that comes with Scala or OCaml, but map, filter, reduce and lambdas are easily accessible. We may get records, tuples, pattern matching and a pipeline operator at some point!
Big shoutout to lru_cache. I tossed in two lines of code and was able to get a 60x speedup in my code by reducing the amount of regular expression compilations I had to do.
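A sketch of the kind of win described (the function names are made up for illustration), assuming a hot path that repeatedly compiles the same patterns.

    import re
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def compiled(pattern):
        # Each distinct pattern string is compiled once, then reused.
        return re.compile(pattern)

    def matches(pattern, text):
        return compiled(pattern).search(text) is not None

    print(matches(r"\d+", "order 42"))  # True; later calls with the same pattern skip re.compile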
The irrational hatred of FP is still there, to the point of the absurdity of implementing a hobbled procedural version of pattern matching!
It is bordering on insane.
But even the lambda keyword isn't so bad; you can create a dictionary of expressions to call by name, a lot more compact than declaring them the usual way imo: https://github.com/jazzyjackson/py-validate/blob/master/pyva...
To your point, I only recently learned there's a Map function in Python, while in JS I'm .map(x=>y).filter(x=>y).reduce(x=>y)ing left and right.
> But even the lambda keyword isn't so bad; you can create a dictionary of expressions to call by name, a lot more compact than declaring them the usual way imo: https://github.com/jazzyjackson/py-validate/blob/master/pyva...
The lambda keyword is better than nothing, but it definitely can be improved. Just imagine using JavaScript syntax in your example.
> To your point, I only recently learned there's a Map function in Python, while in JS I'm .map(x=>y).filter(x=>y).reduce(x=>y)ing left and right.
I think that with the introduction of list comprehensions Guido felt the map function was no longer needed; that was why he wanted it removed. I don't deny it, but using map and filter is sometimes just easier to read. Say [foo(v) for v in a] vs map(foo, a) [1].
[1]: https://stackoverflow.com/questions/1247486/list-comprehensi...
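Side by side, with the detail that map in Python 3 returns a lazy iterator rather than a list.

    a = [1, 2, 3]
    foo = lambda v: v * 2

    comp = [foo(v) for v in a]    # list comprehension: already a list
    mapped = map(foo, a)          # map: a lazy iterator in Python 3
    assert comp == list(mapped)   # reify to compare; the iterator is now exhausted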
Functional programming is kind of cool unless it's someone else's code and you need to debug it.
- groupBy is itertools.groupBy(lst, fn)
- sortBy is just lst.sort(key=fn)
- countBy is collections.Counter(map(fn, lst))
- Sibling comment mentioned flatten, which is just [item for sublist for sublist in lst]
More esoteric needs are usually met by itertools.
https://docs.python.org/3/library/itertools.html#itertools.g...
You mean sorted(). list.sort only works on lists and sorts in place.
I never get this one right first time, but surely that's not it.
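For the record, rough corrected forms of the two one-liners being questioned.

    lst = [3, 1, 2]
    fn = abs

    new_list = sorted(lst, key=fn)   # sorted() works on any iterable and returns a new list
    lst.sort(key=fn)                 # list.sort exists only on lists, sorts in place, returns None

    nested = [[1, 2], [3]]
    flat = [item for sublist in nested for item in sublist]  # for-clauses read left to right: [1, 2, 3]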
If you reach for itertools imports often in an interactive REPL, you might be interested in Pyflyby: https://labs.quansight.org/blog/2021/07/pyflyby-improving-ef...
https://towardsdatascience.com/functools-the-power-of-higher...
Ironically it doesn’t feel very functional.
What web server is this? I’ve never heard about this behavior before.
What if we have multiple containers running the same service, load-balanced behind an ingress controller? Local caches are a headache in such a scenario.
The best pattern I have read about is using a cache as a sidecar in the pod. I feel that scales very well.
Why?
FP gains are real. Anyone who tells you otherwise doesn't know what FP is.
To anyone out there who isn't on the FP train yet, get on. You'll become a better programmer for it.
The last time I was surprised was the itertools library.
Also check https://pymotw.com/3/. It's a great tour of the Python standard library.
- Lack of pipe operator
- Multiple arguments everywhere instead of currying
- No do-notation or equivalent
- Reference equality rather than structural equality for objects etc.
If you want to program in this style, consider using F#, R or even JavaScript with a few Babel plugins.
Combine it with Lodash FP: https://github.com/lodash/lodash/wiki/FP-Guide