frogcoder · 4 years ago
I strongly think Python should have more functional programming support. Functional programming languages are usually scary to look at for many programmers. Python would not only be a good FP introduction for them, it would also benefit greatly itself.

Years ago I found out that Guido wouldn't allow tail call optimization to be included, and even tried to remove the map function from the built-ins. So I got the impression that Python wouldn't get further FP support. I really hope that's not the case. With pattern matching coming in 3.10, my hope is high again.

I have very high respect for Guido van Rossum; I'm not here to discredit him. He's one of the authors of PEP 634.

I wish Python had a simpler syntax for lambdas than the current lambda expression; even JS does better on this. A built-in syntax for partial application would also be great. It would be even better if we could have function composition.
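
For example, partial application is already there in functools, and composition is only a few lines, but you have to roll it yourself (the compose helper below is my own sketch, nothing built in):

    from functools import partial, reduce

    def compose(*fns):
        # right-to-left composition: compose(f, g)(x) == f(g(x))
        return reduce(lambda f, g: lambda x: f(g(x)), fns)

    add = lambda a, b: a + b
    inc = partial(add, 1)              # partial application via functools
    double = lambda x: x * 2
    inc_then_double = compose(double, inc)
    print(inc_then_double(3))          # (3 + 1) * 2 == 8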

Some problems are better solved the FP way. It could even make a program more readable, which is one of the strengths of Python.

matsemann · 4 years ago
Yes, coming back to Python after many years with Kotlin and Elm, I see how the Python language nudges people toward writing less readable code.

Mutations everywhere. But mainly because mapping and flattening stuff is so burdensome. And the lack of many FP functions, and of an ergonomic way to call them, makes a crazy list comprehension the go-to tool, which is often much less explicit about what's going on than calling a function with a named and widely understood concept.

zbentley · 4 years ago
> And the lack of many FP functions, and of an ergonomic way to call them, makes a crazy list comprehension the go-to tool, which is often much less explicit about what's going on than calling a function with a named and widely understood concept.

I've said roughly this before somewhere on HN but cannot find it right now: list comprehensions are like regexes. Below a certain point in complexity, they're a much cleaner and immediately-graspable way of expressing what's going on in a given piece of code. Past that point, they rapidly become far worse than the alternative.

For example, "{func(y): y for x, y in somedict.items() if filterfunc(x)}" is clearer than the equivalent loop-with-tempvars, and significantly clearer than "dict(map(lambda k: (func(somedict[k]), somedict[k]), filter(filterfunc, somedict.keys())))". Even with some better mapping utilities (something like "mapkeys" or "filtervalues", or a mapping utility for dictionaries that inferred keys-vs-values based on the arity of map functions), I think the "more functional" version remains the least easily intelligible.
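
To make that concrete, with throwaway stand-ins for func and filterfunc (both versions build the same dict):

    somedict = {"a": 1, "bb": 2, "ccc": 3}
    filterfunc = lambda k: len(k) > 1      # keep keys longer than one character
    func = lambda v: v * 10

    comp = {func(y): y for x, y in somedict.items() if filterfunc(x)}
    functional = dict(map(lambda k: (func(somedict[k]), somedict[k]),
                          filter(filterfunc, somedict.keys())))

    assert comp == functional == {20: 2, 30: 3}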

However, once you cross into nested or simultaneous comprehensions, you really need to stop, drop, and bust out some loops, named functions, and maybe a comment or two. So too with regex! Think about the regex "^prefix[.](\S+) [1-7]suffix$"; writing the equivalent stack of splits and conditionals would be more confusing and easier to screw up than using the regex; below a point of complexity it's a much clearer and better tool to slice up strings. Past a point regexes, too, break down (for me that point is roughly "the regex doesn't fit on a line" or "it has lookaround expressions", but others may have different standards here).

mjburgess · 4 years ago
They added "pattern matching" as a statement.

The irrational hatred of FP is still there, to the point of the absurdity of implementing a hobbled procedural version of pattern matching!

It is bordering on insane.

toxik · 4 years ago
Please suggest a pattern matching expression syntax that doesn’t require completely changing how the language works.

Pattern matching took so long because it was extremely hard to find that compromise, and I actually think they did an okay job. No, it won’t cover every case, but it will also cover the important ones.

jazzyjackson · 4 years ago
Learn You a Haskell for Great Good taught me list comprehension, and I was surprised to see such a pithy shorthand available in Python (+ dict comprehension). That's a very functional way to transform objects from one shape to another with an in-line expression, no lambda keyword there.

But even the lambda keyword isn't so bad: you can create a dictionary of expressions to call by name, which is a lot more compact than declaring them the usual way imo: https://github.com/jazzyjackson/py-validate/blob/master/pyva...
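
Something like this, roughly (a made-up sketch, not the actual code in that repo):

    # expressions keyed by name; each one is a one-line lambda
    checks = {
        "positive": lambda x: x > 0,
        "even": lambda n: n % 2 == 0,
        "non_empty": lambda s: len(s) > 0,
    }

    print(checks["even"](4))        # True
    print(checks["positive"](-1))   # False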

To your point, I only recently learned there's a Map function in Python, while in JS I'm .map(x=>y).filter(x=>y).reduce(x=>y)ing left and right.

frogcoder · 4 years ago
Yes! List/Dictionary/Generator comprehensions are one big plus for Python; they probably came from the functional world. I use them whenever I can.

> But even the lambda keyword isn't so bad: you can create a dictionary of expressions to call by name, which is a lot more compact than declaring them the usual way imo: https://github.com/jazzyjackson/py-validate/blob/master/pyva...

The lambda keyword is better than nothing, but it can definitely be improved. Just imagine using JavaScript syntax in your example.

> To your point, I only recently learned there's a Map function in Python, while in JS I'm .map(x=>y).filter(x=>y).reduce(x=>y)ing left and right.

I think with the introduction of list comprehensions Guido felt the map function was no longer needed; that's why he wanted it removed. I don't deny that, but map and filter are sometimes just easier to read. Say [foo(v) for v in a] vs map(foo, a).

0x008 · 4 years ago
Yes, but map in Python is ridiculously slow when combined with a lambda [1]

[1]: https://stackoverflow.com/questions/1247486/list-comprehensi...
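
A rough way to see it for yourself (exact numbers vary by machine and workload):

    import timeit

    setup = "xs = list(range(1000))"
    t_map = timeit.timeit("list(map(lambda x: x * 2, xs))", setup=setup, number=10000)
    t_comp = timeit.timeit("[x * 2 for x in xs]", setup=setup, number=10000)
    print(f"map+lambda: {t_map:.2f}s  comprehension: {t_comp:.2f}s")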

patterns · 4 years ago
I agree.

One gripe that I have with functions like map is that it returns a generator, so you have to be careful when reusing results. I fell into this trap a few times.
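
The trap looks something like this:

    squares = map(lambda x: x * x, [1, 2, 3])
    print(list(squares))   # [1, 4, 9]
    print(list(squares))   # [] -- already exhausted, nothing left to iterate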

I'd also like a simpler syntax for closures; it would make writing embedded DSLs less cumbersome.

zbentley · 4 years ago
> One gripe that I have with functions like map is that it returns a generator, so you have to be careful when reusing results

I hope that is never changed; I often write code in which the map function's results are very large, and keeping those around by default would cause serious memory consumption even when I am mapping over a sequence (rather than another generator).

Instead, I'd advocate the opposite extreme: Python makes it a little too easy to make sequences/vectors (with comprehensions); I wish generators were the default in more cases, and that it was harder to accidentally reify things into a list/tuple/whatever.

I think that if the only comprehension syntax available was the one that created a generator--"(_ for _ in _)"--and you always had to explicitly reify it by calling list(genexp) or tuple(genexp) if you wanted a sequence, then the conventions in the Python ecosystem would be much more laziness-oriented and more predictable in terms of memory consumption.
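
i.e. something like this, where laziness stays the default and reification is always explicit:

    lazy = (x * x for x in range(10**7))    # generator expression: nothing computed yet
    total = sum(lazy)                       # consumed one element at a time, small memory footprint

    eager = list(x * x for x in range(10))  # reify explicitly when you really want a sequence
    print(total, eager[:3])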

Ah well, water under the bridge, I know.

ehsankia · 4 years ago
Simply wrap it in list(), or use a list comprehension. Most things in Python 3 are iterators now, so the same applies in many places.
pantsforbirds · 4 years ago
I think that's the most powerful part of the tool. Being able to effortlessly write lazy generators is absolutely amazing when working with any sort of I/O system or async code.
architech · 4 years ago
> It would be even better if we could have function composition.

I think composition and piping are such basic programming tools, and they make a lot of code much cleaner. It's a shame they're not built into Python.

So, shameless plug, in the spirit of functools and itertools I made the pipetools library [0]. On top of forward-composition and piping it also enables more concise lambda expressions (e.g. X + 1) and more powerful partial application.

[0] https://0101.github.io/pipetools/doc/

ehsankia · 4 years ago
Aren't you contradicting yourself a bit?

First you claim functional languages are scary to look at, then you say you want Python to become more like them. But maybe the reason Python is elegant and easier to read is exactly because Guido had the self-restraint not to go full functional. You also miss the part of the story where he actually did remove `reduce` from the builtins, exactly because of how unreadable and confusing it is to most people.

It's exactly that kind of decision-making I expected from a BDFL, and I think Guido did a great job, while he was one, of keeping Python from going down such paths.

frogcoder · 4 years ago
Well, the BDFL is probably the only dictator we all love.

Any decision he made counts infinitely more than anything I could decide, because I am just a Python user and an outsider to any decision-making process. So for me, he's right all the time. That's a perfect definition of a dictator :)

But I do have wishes. It's like I love my parents but I do want to stay up late sometimes.

Yeah, I totally missed the part where he removed reduce from the builtins; sorry, my memory. map, filter, or reduce, it does not matter. As I stated, some problems are better solved the functional way. Because Python is such a friendly language, if it included the functional paradigm properly, it would make the functional parts more readable than in other functional languages.

FP is scary not because it has evil syntax to keep people at a distance; it's just an alien paradigm to many. Lots of non-functional languages have functional support, which doesn't make them less readable, e.g. C# and JS. I suspect these languages have helped many people understand FP better. Python could make the jump by including more FP without turning into a full-fledged FP language.

BTW, I'm still glad reduce is kept in functools.

collyw · 4 years ago
I was going to say something to that effect.

Functional programming is kind of cool unless it's someone else's code and you need to debug it.

zmmmmm · 4 years ago
The worst part of Python is the lack of utility functions for munging collections. But it sits at a slightly higher level than this - things that are idiomatic in other scripting languages, like groupBy(fn), sortBy(fn), countBy(fn), collate, are all inexplicably like IKEA self-assembly furniture in Python instead of first-class entities. It makes lots of routine data munging and algorithmic work much more annoying than it needs to be (compared to Groovy, Ruby, etc.).
smallnamespace · 4 years ago
Not sure what collate is supposed to be, but these idioms are hardly difficult if you're used to Python:

- groupBy is itertools.groupBy(lst, fn)

- sortBy is just lst.sort(key=fn)

- countBy is collections.Counter(map(fn, lst))

- Sibling comment mentioned flatten, which is just [item for sublist for sublist in lst]

More esoteric needs are usually met by itertools.

bobbylarrybobby · 4 years ago
The real issue is that Python doesn't really make it easy to define anonymous functions. If you want one with more than one line, you need to name it and move its definition outside the point of use. Quite annoying.
alangpierce · 4 years ago
itertools.groupby isn't really the groupBy operation that people would normally expect. It looks like it would do a SQL-style "group by" where it categorizes elements across the collection, but really it only groups adjacent elements, so you end up with the same group key multiple times, which can cause subtle bugs. From my experience, it's more common for it to be misused than used correctly, so at my work we have a lint rule disallowing it. IMO this surprising behavior is one of the unfriendliest parts of Python when it comes to collection manipulation.

https://docs.python.org/3/library/itertools.html#itertools.g...
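
Concretely (my own toy example, not from the docs):

    from itertools import groupby

    data = ["apple", "avocado", "banana", "apricot"]   # not sorted by first letter
    print([(k, list(g)) for k, g in groupby(data, key=lambda s: s[0])])
    # [('a', ['apple', 'avocado']), ('b', ['banana']), ('a', ['apricot'])]
    # 'a' shows up twice: only *adjacent* runs were grouped

    data.sort(key=lambda s: s[0])                      # sort first to get SQL-style grouping
    print([(k, list(g)) for k, g in groupby(data, key=lambda s: s[0])])
    # [('a', ['apple', 'avocado', 'apricot']), ('b', ['banana'])]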

sparsely · 4 years ago
I think you've kind of proven the OP's point: these all have different patterns and are in different places.
gfaure · 4 years ago
Annoyingly, itertools is one of those packages in the Python standard library with wordsruntogether-named functions, so it's really known as itertools.groupby instead.
masklinn · 4 years ago
> - sortBy is just lst.sort(key=fn)

You mean sorted(). list.sort only works on lists, and in place.

dmurray · 4 years ago
> Sibling comment mentioned flatten, which is just [item for sublist for sublist in lst]

I never get this one right first time, but surely that's not it.

sleavey · 4 years ago
By collate, OP might be referring to the behaviour provided by the builtin `zip`.
KingOfCoders · 4 years ago
Q.E.D.
hcrisp · 4 years ago
It's a package, not a built-in, but pytoolz [0] is a very complete solution for the type of functional programming munging you are talking about. I wish more people knew about it (no one here seems to have mentioned it). And it has been Cython-optimized in another package called cytoolz [1]. The author explains in the Heritage section how it is an extension of, and plays nicely with, itertools and functools [2].

For Python coders new to functional programming, and how it can make working with data easier, I highly recommend reading the following sections of the pytoolz docs: Composability, Function Purity, Laziness, and Control Flow [0].

[0] https://toolz.readthedocs.io/en/latest/

[1] https://github.com/pytoolz/cytoolz

[2] https://toolz.readthedocs.io/en/latest/heritage.html
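
For a quick flavour (assuming you have toolz installed; see the docs for the full API):

    from toolz import compose, groupby, pipe

    words = ["apple", "banana", "avocado", "blueberry", "cherry"]

    # SQL-style grouping, no pre-sorting needed (unlike itertools.groupby)
    print(groupby(lambda w: w[0], words))
    # {'a': ['apple', 'avocado'], 'b': ['banana', 'blueberry'], 'c': ['cherry']}

    # pipe threads a value through functions, left to right
    print(pipe(words, lambda ws: map(len, ws), sum))   # 33

    # compose builds a new function, right to left
    first_upper = compose(str.upper, lambda w: w[0])
    print(first_upper("apple"))                        # A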

steve_gh · 4 years ago
Yes - the description of writing Python as being like assembling IKEA furniture is absolutely spot-on. Yes, you can do it, and the results can be nice, but by God it is sometimes such a pain.

The comment showing where groupBy, sortBy etc. can be found just shows the problem - they are all in different libraries. That's just plain annoying! And don't get me started on the pain of trying to build an Ordered Dictionary with a default initial value!

robertlagrant · 4 years ago
> And don't get me started on the pain of trying to build an Ordered Dictionary with a default initial value!

  >>> from collections import defaultdict
  >>> from datetime import datetime
  >>>
  >>> d = defaultdict(datetime.now)
  >>> d[1], d[2], d[3], d[4], d[0]
  (datetime.datetime(2021, 7, 9, 15, 50, 52, 87605), datetime.datetime(2021, 7, 9, 15, 50, 52, 87613), datetime.datetime(2021, 7, 9, 15, 50, 52, 87614), datetime.datetime(2021, 7, 9, 15, 50, 52, 87615), datetime.datetime(2021, 7, 9, 15, 50, 52, 87616))
  >>> for k,v in d.items():
  ...   print(k,v)
  ...
  1 2021-07-09 15:50:52.087605
  2 2021-07-09 15:50:52.087613
  3 2021-07-09 15:50:52.087614
  4 2021-07-09 15:50:52.087615
  0 2021-07-09 15:50:52.087616
  >>>
Seems pretty good to me?

mixmastamyk · 4 years ago
Like defaultdict? All dicts have been ordered since 3.6.
roenxi · 4 years ago
> ...like groupBy(fn), sortBy(fn), countBy(fn), collate...

What leaps out at me is that these 3 functions are all straight out of a relational algebra worldview.

Python the language doesn't support relational algebra as a first class concept. The reason it feels like IKEA self assembly is probably because you are implicitly implementing a data model that isn't how Python thinks about collections.

quietbritishjim · 4 years ago
Exactly. Pythonic versions:

    groupBy(fn): {k: [x for x in foo if fn(x) == k] for k in {fn(x) for x in foo}}
    sortBy(fn): sorted(foo, key=fn)
    countBy(fn): Counter(fn(x) for x in foo)
    flatten: [x for y in foo for x in y]
    filter(fn): [x for x in foo if fn(x)]
All the "for x in blah" can seem like a lot of boilerplate to a non-Pythonista, but it becomes subconscious once you're used to it and actually helps you feel the structure (a bit like how indentation isn't necessary in C but still helps you see it).

For compound operations (e.g. merge two lists, filter and flatten), I find the code a lot easier to "feel" than if you'd combined several functional-style functions, where you have to read the function names instead of just seeing the structure.

globular-toast · 4 years ago
One of the harder things for beginners to grasp when learning a second or third programming language is that language is not just syntax. You see it all the time in Python in particular: people writing code like it's C or Java. It's no different to grabbing a French dictionary and transliterating your English sentence to French word by word. It just doesn't work and, in some cases, is completely wrong. Grokking a language means so much more than knowing the key words for doing all the stuff you used to do in your old language.
lwouis · 4 years ago
Most popular languages offer similar operations on collections these days. I've been using Java, Scala, JS/TS, purescript, Ruby, C++, Rust in the past, and these days all have a similar offering to iterate over collections using idiomatic functions such as map/filter/sort/etc.

I'm working on some Python these days, and I find the one-line lambdas quite unpleasant, along with the generally messy options for implementing common collection operations.

BurningFrog · 4 years ago
It will take Python 10 more years to allow a flatten()...
joshuamorton · 4 years ago
Do you need the recursive form? Because a single flatten is itertools.chain.from_iterable
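
For example:

    from itertools import chain
    print(list(chain.from_iterable([[1, 2], [3], [4, 5]])))   # [1, 2, 3, 4, 5]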
staticautomatic · 4 years ago
more_itertools ftw
masklinn · 4 years ago
10 years is very optimistic: flatten has little use without a functional pipeline (something like Clojure's threading macro), since in other contexts it's a trivial comprehension.
goodside · 4 years ago
I think the motivation there was historically to encourage the use of generators and generator expressions. It's definitely intentional: `reduce` used to be a builtin but now it's an import, and `apply` was dropped entirely.

If you reach for itertools imports often in an interactive REPL, you might be interested in Pyflyby: https://labs.quansight.org/blog/2021/07/pyflyby-improving-ef...

publicola1990 · 4 years ago
Also, many Python standard library collection methods modify their inputs.
emmelaich · 4 years ago
Some of those are given as examples in the functools / itertools doc pages.
gsinclair · 4 years ago
And it's simply amazing that functools doesn't fill this gap.
frogcoder · 4 years ago
Site is down. I believe this is the same article by the author.

https://towardsdatascience.com/functools-the-power-of-higher...

gorgoiler · 4 years ago
Heh, a younger me would definitely be walking around with a black T-shirt with:

  from functools import *
…on the front and

  @cached_property
  def troll_face(self):
    return ':)'
…on the back.

functools and itertools are amazing and I love them both. They are especially useful for teaching high school CS without having to stray from Python, which the kids at all levels know well and are comfortable with.

However.

Using @cached_property feels like a bad code smell and it’s a controversial design decision. The example given is more like an instance of a factory? Perhaps if the result of the render function was an Important Object™ (not just a mere string) then the function’s callers might not be so cavalier about discarding the generated instance.

I would like to see the calling code that calls `render` more than once in two different places (hence the need for caching with no option for cache invalidation?). When every stack frame is a descendant of main() then there’s no such thing as “two different places”!

    def main():
      page = HN().render()
      while 'too true':
        do_work(page)
        tea_break(page)
Ironically it doesn’t feel very functional.

matsemann · 4 years ago
Because of the GIL the cache is often pretty useless on a webserver anyway, as each and every request gets a different process not sharing that cache. So then one has to deploy Redis in addition and complicate the stack. That makes sense with a shared cache when there are multiple instances, but often it's overkill. If I have to go "external" to fetch the data, I might as well not cache anything.
judofyr · 4 years ago
> each and every request gets a different process not sharing that cache

What web server is this? I’ve never heard about this behavior before.

sreeramb93 · 4 years ago
Exactly. How do these caches behave when Python is used as a web service in a multi-process, multi-threaded setting?

What if we have multiple containers running the same service, load-balanced through an ingress controller?

Local caches are a headache in such a scenario.

The best pattern I have read about is using a cache as a sidecar in a pod. I feel that scales very well.

wbradley · 4 years ago
Please do not promote the use of wildcard imports.
TristanDaCunha · 4 years ago
> Using @cached_property feels like a bad code smell and it’s a controversial design decision

Why?

gorgoiler · 4 years ago
It's just a smell. It hints that instead of passing the instance around, one should be passing the output of the function.

It also doesn't really matter. The only time things like this have actually been important was in giant Java codebases, where attempts were made to keep packages cleanly separated (so passing the simplest possible public types reduced the number of public classes).

ionforce · 4 years ago
After having done Scala for a few years, it's finally a pleasure to be able to read and understand a thread like this.

FP gains are real. Anyone who tells you otherwise doesn't know what FP is.

To anyone out there who isn't on the FP train yet, get on. You'll become a better programmer for it.

Zababa · 4 years ago
That's one of the things I love about JS. Sure, you don't have all the good static typing that comes with Scala or OCaml, but map, filter, reduce and lambdas are easily accessible. We may get records, tuples, pattern matching and a pipeline operator at some point!

kumarvvr · 4 years ago
I am delighted every time the Python Standard Library is mentioned. It always has a nice surprise.

The last time I was surprised, it was by the itertools library.

lnyan · 4 years ago
I feel the same way. Last week, I was surprised that `pow` in Python is actually a modpow: `pow(base, exp[, mod])`.

Also check https://pymotw.com/3/. It's a great tour of the Python standard library

hashmush · 4 years ago
And since 3.8 the exponent can be negative, which lets you compute modular inverses without having to reimplement the extended Euclidean algorithm.

    >>> pow(5, -1, 13)
    8
    >>> 5 * 8 % 13
    1

magnio · 4 years ago
In case you didn't know about them, check out data classes. They're so obvious yet so useful that I don't want to write Python older than 3.7 anymore.
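
A minimal example of what you get for free:

    from dataclasses import dataclass

    @dataclass
    class Point:
        x: float
        y: float

    p = Point(1.0, 2.0)
    print(p)                      # Point(x=1.0, y=2.0) -- __init__ and __repr__ generated
    print(p == Point(1.0, 2.0))   # True -- __eq__ generated too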
nerdponx · 4 years ago
Worthy of note: https://attrs.org, which inspired dataclasses and has more features.
nwomack · 4 years ago
Yes, I agree. I also use typing.NamedTuple a lot.
de_keyboard · 4 years ago
Functional programming in Python sucks for a few reasons:

- Lack of pipe operator

- Multiple arguments everywhere instead of currying

- No do-notation or equivalent

- Reference equality rather than structural equality for objects etc.

If you want to program in this style, consider using F#, R or even JavaScript with a few Babel plugins.

henearkr · 4 years ago
Which Babel plugins would you recommend?
de_keyboard · 4 years ago
ncc-erik · 4 years ago
Big shoutout to lru_cache. I tossed in two lines of code and was able to get a 60x speedup in my code by reducing the number of regular expression compilations I had to do.
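
The pattern is roughly this (my own sketch, not the actual code):

    import re
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def compiled(pattern):
        return re.compile(pattern)

    # repeated calls with the same pattern string reuse the compiled regex
    print(bool(compiled(r"\d+").match("123")))   # True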