kbd · 7 years ago
Despite controversy, walrus operator is going to be like f-strings. Before: "Why do we need another way to..." After: "Hey this is great".

People are wtf-ing a bit about the positional-only parameters, but I view that as just a consistency change. It's a way to write in pure Python something that was previously only possible to express using the C API.
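A minimal sketch of the new `/` marker (the function name `pow_like` and its parameters are my own illustration, assuming a Python 3.8 interpreter):

```python
# Positional-only parameters, new in 3.8: everything before the "/"
# cannot be passed by keyword, mirroring C-implemented builtins like pow().
def pow_like(base, exp, /, mod=None):
    result = base ** exp
    return result % mod if mod is not None else result

print(pow_like(2, 10))            # 1024 -- positional call is fine
print(pow_like(2, 10, mod=1000))  # 24
# pow_like(base=2, exp=10)        # TypeError: positional-only argument
```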

stefco_ · 7 years ago
f-strings are the first truly-pretty way to do string formatting in python, and the best thing is that they avoid all of the shortcomings of other interpolation syntaxes I've worked with. It's one of those magical features that just lets you do exactly what you want without putting any thought at all into it.

Digression on the old way's shortcomings: Probably the most annoying thing about the old "format" syntax was writing error messages with parameters dynamically formatted in. I've written ugly string literals for verbose, helpful error messages with the old syntax, and it was truly awful. The sheer length of calls to "format" is what screws up your indentation, which then screws up the literals (or forces you to spread them over 3x as many lines as you would otherwise). It was so bad that the format operator was more readable. If `str.dedent` was a thing it would be less annoying thanks to multi-line strings, but even that is just a kludge. A big part of the issue is whitespace/string concatenation, which, I know, can be fixed with an autoformatter [0]. Autoformatters are great for munging literals (and diff reduction/style enforcement), sure, but if you have to mung literals tens of times in a reasonably-written module, there's something very wrong with the feature that's forcing that behavior. So, again: f-strings have saved me a ton of tedium.

[0] https://github.com/python/black

lewiscollard · 7 years ago
For me, the huge thing about f-strings was that invalid string format characters become a compile-time error (SyntaxError).

  print('I do not get executed :)')
  f'{!}'


  File "stefco.py", line 2
    f'{!}'
    ^
  SyntaxError: f-string: empty expression not allowed
This has the pleasing characteristic of eliminating an entire class of bug. :)

bulatb · 7 years ago
> If `str.dedent` was a thing

Have you looked at textwrap.dedent?
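For reference, a quick sketch of what `textwrap.dedent` does (the message text is just an example):

```python
import textwrap

# dedent strips the common leading whitespace, so a multi-line literal
# can stay aligned with the surrounding code instead of hugging column 0.
message = textwrap.dedent("""\
    error: expected a positive integer
    hint: check the config file
    """)
print(message)
```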

Deleted Comment

choppaface · 7 years ago
Was the walrus operator really worth "The PEP 572 mess"? https://lwn.net/Articles/757713/

That post makes a few things very clear:

* The argument over the feature never established an explicit measure of its efficacy. The discussion struggled even to find relevant, non-toy code examples.

* The communication over the feature was almost entirely over email, even when it got extremely contentious. There was later some face-to-face talk at the summit.

* Guido stepped down.

contravariant · 7 years ago
It may not have been a fair trade, but then it wasn't a trade in the first place. Those all seem to be problems with the process itself, meaning that it could have happened any time a contentious feature came up, this just happened to be the one to trigger the problem.
heydenberk · 7 years ago
I'd used f-string-like syntaxes in other languages before they came to Python. It was immediately obvious to me what the benefit would be.

I've used assignment expressions in other languages too! Python's version doesn't suffer from the JavaScript problem whereby equality and assignment are just a typo apart in, e.g., the condition of your while loop. Nonetheless, I find that it ranges from marginally beneficial to marginally confusing in practice.

fny · 7 years ago
I love string interpolation! But this seems to take it to a bizarre place just to save a few keystrokes. Seriously, how is f"{now=!s}" substantially better than f"now={str(now)}"?

Ergonomically, I see little benefit for the added complexity.
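For context, a sketch of what the `=` specifier (new in 3.8) actually buys: it echoes the expression source text, which is the part the manual version has to duplicate by hand (assuming a 3.8 interpreter; the variable names are mine):

```python
from datetime import datetime

now = datetime(2019, 7, 4, 12, 0)

# The "=" specifier echoes the expression source followed by its value;
# "!s" switches the conversion from repr() to str().
print(f"{now=!s}")        # now=2019-07-04 12:00:00
print(f"now={str(now)}")  # same output, expression typed twice

x = 41 + 1
print(f"{x=}")            # x=42
```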

Deleted Comment

agumonkey · 7 years ago
I'm trying to write an elisp macro because yes.. f-strings are too convenient
Scarblac · 7 years ago
By itself I agree, every now and then you write a few lines that will be made a little shorter now that := exists.

But there's a long standing trend of adding more and more of these small features to what was quite a clean and small language. It's becoming more complicated, backwards compatibility suffers, the likelihood your coworker uses some construct that you never use increases, there is more to know about Python.

Like f-strings, they are neat I guess. But we already had both % and .format(). Python is becoming messy.

I doubt this is worth that.

vorticalbox · 7 years ago
they should pick f and put deprecation warnings on the other two, python is getting messy.
ehsankia · 7 years ago
Was the controversy really about the need for the feature? I thought most people agreed it was a great feature to have, and most of the arguments were about `:=` vs re-using `as` for the operator.
pmart123 · 7 years ago
I like "as" instead. I didn't realize that was on the table. To me, it seems more Pythonic given the typical English-like Python syntax of "with open(path) as file", "for element in items if element not in things", etc.
linsomniac · 7 years ago
I don't know in this case, but I do know that the Python community tends to have strong opinions about things. The := resulted in Guido stepping down, which I think is a good indicator that there wasn't agreement that it was "a great feature to have" and just down to syntax... :-(
kbd · 7 years ago
All discussion I've ever seen was about the need for the feature, not its spelling. I didn't even know "as" was proposed, but in fact it is an "alternate spelling" they considered[1] in the PEP.

[1] https://www.python.org/dev/peps/pep-0572/#alternative-spelli...

linsomniac · 7 years ago
I've literally been wanting something like the walrus operator since I first started using Python in '97. Mostly for the "m = re.match(x, y); if m: do_something()" sort of syntax.
dual_basis · 7 years ago
I mean, that isolated example doesn't really demonstrate the benefit of a walrus operator does it? You could have just written "if re.match(x, y): do_something()". If you re-used the result of computation within the if statement, I feel that would be a better example, eg. "m = re.match(x, y); if m: do_something(m)".
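The two variants side by side (the helper name `handle` and the sample string are just for illustration, assuming Python 3.8):

```python
import re

def handle(m):
    return m.group(1)

line = "name=walrus"

# Before 3.8: bind, then test, on separate lines
m = re.match(r"(\w+)=", line)
if m:
    print(handle(m))  # name

# 3.8: bind and test in one expression, and the match is still
# available to use in the body
if m := re.match(r"(\w+)=", line):
    print(handle(m))  # name
```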
adito · 7 years ago
I wonder if the controversial Go's error check function "try" proposal[0] will also be similar to this situation.

[0]: https://github.com/golang/go/issues/32437

chasontherobot · 7 years ago
That was already cancelled.

Deleted Comment

brummm · 7 years ago
I think in certain situations the walrus operator will probably be useful. But it definitely makes code less legible, which makes me cautious. The only useful use case I have found so far is list comprehensions where some function evaluation could be reduced to only one execution with the walrus operator.
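A sketch of that list-comprehension case (with `expensive` as a stand-in for the costly call, assuming Python 3.8):

```python
def expensive(x):
    # stand-in for a function you only want to evaluate once per item
    return x * x

data = [1, 2, 3, 4]

# Without the walrus, expensive() runs twice for every kept element
twice = [expensive(x) for x in data if expensive(x) > 4]

# With it, the result is bound once and reused within the comprehension
once = [y for x in data if (y := expensive(x)) > 4]

print(once)  # [9, 16]
```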
kbd · 7 years ago
> But it definitely makes code less legible, which makes me cautious.

Disagree. In cases where it's useful it can make the code much clearer. Just yesterday I wrote code of the form:

    foos = []
    foo = func(a,b,c,d)
    while foo:
        foos.append(foo)
        foo = func(a,b,c,d)
With the walrus operator, that would just be:

    foos = []
    while foo := func(a,b,c,d):
        foos.append(foo)
Further, I had to pull out 'func' into a function in the first place so I wouldn't have something complicated repeated twice, so it would remove the need for that function as well.

craigds · 7 years ago
Yep, I can't wait to use the walrus operator. I just tried it out (`docker run -it python:3.8.0b2-slim`) and I'm hooked already.

Also, it's times like these I'm really glad docker exists. Trying that out before docker would have been a way bigger drama

voldacar · 7 years ago
Python looks more and more foreign with each release. I'm not sure what happened after 3.3 but it seems like the whole philosophy of "pythonic", emphasizing simplicity, readability and "only one straightforward way to do it" is rapidly disappearing.
teddyh · 7 years ago
“I've come up with a set of rules that describe our reactions to technologies:

1. Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

2. Anything that's invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

3. Anything invented after you're thirty-five is against the natural order of things.”

― Douglas Adams, The Salmon of Doubt

stakhanov · 7 years ago
It's wrong to frame this as resistance to change for no reason. See my other comment. I see some of this stuff as repeating mistakes that were made in the design of Perl. ...but there are few people around these days who know Perl well enough to recognize the way in which history is repeating itself, and that has at least something to do with age.
bacon_waffle · 7 years ago
Point #1 is expanded on in Feral by George Monbiot. Basically, we have a tendency to see the outside world we grew up with as the way things naturally should be, ignoring that previous generations may have changed it to be that way. That sheep-grazed pastoral landscape is easy to view as a thing worth preserving, but to an ecologist it might be a barren waste where there used to be a beautiful forest.
tialaramex · 7 years ago
Forewarned is forearmed. I headed into adulthood watching out for such mirages. For example: Making sure to listen to pop music enough that it does exactly what pop music is supposed to do (worm its way into your subconscious) so I don't wake up one morning unaccountably believing that Kylie Minogue was good but Taylor Swift isn't.

My understanding of Python will probably never be quite as good as my understanding of C, but I can live with that.

mehrdadn · 7 years ago
I don't think it's just >35-year-olds who find what's going on in Python against the natural order of things?
voldacar · 7 years ago
No, those aren't really the reasons for my reaction. And if I told you my age, you would probably switch your argument and say that I'm far too young to criticize ;)
dual_basis · 7 years ago
I am an example which supports this notion. I did some Python programming about 10 years ago but then took a break from programming altogether for the last 9 years. Last year I got back into it and have been using Python 3.7, and I personally love all the most recent stuff. I hate having to go back to 3.5 or even 3.6, and I end up pulling in stuff from `__future__`.
throw2016 · 7 years ago
This 'resistance to change' catchall argument puts everything beyond criticism, and it can be used/abused in every case of criticism. It seeks to reframe 'change' from a neutral word - change can be good or bad - to a positive instead of focusing on the specifics.

Anyone making this argument should be prepared to accept that every single criticism they make in their life moving forward can be framed as 'their resistance to change'.

This kind of personalization of specific criticism is disingenuous and political and has usually been used as a PR strategy to push through unpopular decisions. Better to respond to specific criticisms than reach for a generic emotional argument that seeks to delegitimize scrutiny and criticism.

whatshisface · 7 years ago
Does that mean someone born in 2008 will think C++ is simple and elegant?
cameronbrown · 7 years ago
Except that Python existed before I was born and I still appreciate the concept of 'Pythonic'. The language should stay true to its roots.
ineedasername · 7 years ago
> Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

Yep, I entered the Python world with v2. I eventually reconciled myself to 2.7, and have only recently and begrudgingly embraced 3. Being over 35, I must be incredibly open minded on these things.

nerdponx · 7 years ago
Can you give an example of something like this happening to the language? IMO 3.6+ brought many positive additions to the language, which I also think are needed as its audience grows and its use cases expand accordingly.

The walrus operator makes while loops easier to read, write and reason about.

Type annotations were a necessary and IMO delightful addition to the language as people started writing bigger production code bases in Python.

Data classes solve a lot of problems, although with the existence of the attrs library I'm not sure we needed them in the standard library as well.

Async maybe was poorly designed, but I certainly wouldn't complain about its existence in the language.

F strings are %-based interpolation done right, and the sooner the latter are relegated to "backward compatibility only" status the better. They are also more visually consistent with format strings.

Positional-only arguments have always been in the language; now users can actually use this feature without writing C code.

All of the stuff feels very Pythonic to me. Maybe I would have preferred "do/while" instead of the walrus but I'm not going to obsess over one operator.

So what else is there to complain about? Dictionary comprehension? I don't see added complexity here, I see a few specific tools that make the language more expressive, and that you are free to ignore in your own projects if they aren't to your taste.

masklinn · 7 years ago
> F strings are %-based interpolation done right, and the sooner the latter are relegated to "backward compatibility only" status the better. They are also more visually consistent with format strings.

No, f-strings handle a subset of %-based interpolation. They're nice and convenient but e.g. completely unusable for translatable resources (so is str.format incidentally).

ptx · 7 years ago
What would "do/while" look like in Python? Since blocks don't have end markers (e.g. "end", "}", etc.) there's nowhere to put the while expression if you want the syntax to be consistent with the rest of the language.
sametmax · 7 years ago
Most code still looks like traditional Python. Just like metaprogramming or monkey patching, the new features are used sparingly by the community. Even the less controversial type hints appear in maybe 10 percent of the code out there.

It's all about the culture. And Python culture has been protecting us from abuses for 20 years, while allowing to have cool toys.

Besides, in this release (and even the previous one), apart from the walrus operator, which I predict will be used in moderation, I don't see any alien-looking stuff. This speed of evolution is quite conservative IMO.

Whatever you do, there will always be people complaining, I guess. After all, I also hear all the time that Python doesn't change fast enough, or lacks some black magic from functional languages.

wyldfire · 7 years ago
> Even the less controversial type hints are here on maybe 10 percent of the code out there.

I think this metric is grossly overestimated. Or your scope for "out there" is considering some smaller subset of python code than what I'm imagining.

I think the evolution of the language is a great thing and I like the idea of the type hints too. But I don't think most folks capitalize on this yet.

dudul · 7 years ago
What's an example of black magic from functional languages?
llukas · 7 years ago
If you complained more specifically it would be possible to discuss. For what was described in article I don't see anything "foreign". Python was always about increasing code readability and those improvements are aligning well with this philosophy.
baq · 7 years ago
i've been hearing this since 1.5 => 2.0 (list comprehensions), then 2.2 (new object model), 2.4 (decorators)...

happy python programmer since 1.5, currently maintaining a code base in 3.7, happy about 3.8.

spamizbad · 7 years ago
I cut my teeth on 2.2-2.4 and remember getting my hand slapped when 2.4 landed and I used a decorator for the first time.

It was to allow only certain HTTP verbs on a controller function. A pattern adopted by most Python web frameworks today.

runxel · 7 years ago
That's especially funny given how everybody screams "that's not pythonic!!1!" nowadays when somebody does _not_ use a list comprehension...
Razengan · 7 years ago
The '*' and '/' in function parameter lists for positional/keyword arguments look particularly ugly and unintuitive to me. More magic symbols to memorize or look up.
riffraff · 7 years ago
I also cannot honestly think of a case where I want that behaviour.

The "pow" example looks more like a case where the C side should be fixed.

jnwatson · 7 years ago
Beyond the older-than-35 reason, I think a lot of folks are used to the rate of new features because there was a 5 year period where everyone was on 2.7 while the new stuff landed in 3.x, and 3.x wasn't ready for deployment.

In reality, the 2.x releases had a lot of significant changes. Off the top of my head: context managers, a new OOP/multiple-inheritance model, division operator changes, and lots of new modules.

It sucks that one's language is on the upgrade treadmill like everything else, but language design is hard, and we keep coming up with new cool things to put in it.

I don't know about Python 3.8, but Python 3.7 is absolutely amazing. It is the result of 2 decades of slogging along, improving bit by bit, and I hope that continues.

LaGrange · 7 years ago
In my experience, every technology focused on building a "simple" alternative to a long-established "complex" technology is doomed to discover exactly _why_ the other one became "complex." Also spawn at least five "simple" alternatives.

Doesn't mean nothing good comes out of them, and if it's simplicity that motivates people then eh, I'll take it, but gosh darn the cycle is a bit grating by now.

orhmeh09 · 7 years ago
Could you provide some examples? Without having had that experience, I’m having trouble picturing a concrete example that I would be sure is of the same kind.
newen · 7 years ago
Haha, what was that quote? Something like, any language is going to iterate towards a crappy version of lisp.
amedvednikov · 7 years ago
I'm working on a language with a focus on simplicity and "only one way to do it": https://vlang.io

The development has been going quite well:

https://github.com/vlang/v/blob/master/CHANGELOG.md

noname120 · 7 years ago
Really interesting. For the skeptics, this is not just a proof of concept. There is a real app made using this language: https://volt-app.com/
jamesb93 · 7 years ago
This is great! Thanks for your work. Can V be integrated into existing c++ projects? I work in audio and constantly working in c++ is tiring. I'd love to work in something like V and transpile down.
flavio81 · 7 years ago
>I'm working on a language with a focus on simplicity and "only one way to do it":

If I wanted a language with "only one way to do it", i'd use Brainfuck. Which, btw, is very easy to learn, well documented, and the same source code runs on many, many platforms.

JustSomeNobody · 7 years ago
I see what you're saying, but I kinda like the gets ":=" operator.
simondw · 7 years ago
But now there are two ways to do assignment. That's not very pythonic, is it?
tjpnz · 7 years ago
I don't think that philosophy was ever truly embraced to begin with. If you want evidence of that try reading the standard library (the older the better) and then try running the code through a linter.
chc · 7 years ago
The idea that str.format produced simpler or more readable code than f-strings is contrary to the experience of most Python users I know. Similarly, the contortions we have to go through in order to work around the lack of assignment expressions are anything but readable.

I do agree that Python is moving further and further away from the only-one-way-to-do-it ethos, but on the other hand, Python has always emphasized practicality over principles.

pippy · 7 years ago
This is what happens when you lose a BDFL. While things become more "democratic", you lose the vision and start trying to make everyone happy.
mixmastamyk · 7 years ago
Walrus operator is the direct result of the BDFL pushing it over significant objection.
dlbucci · 7 years ago
Well, there were 4 versions released since 3.3 that still had a BDFL, so I dunno if that's the issue, yet.
hetman · 7 years ago
I'm someone who loves the new features even though I don't think they're "pythonic" in the classical meaning of the term. That makes me think that being pythonic at its most basic level is actually about making it easier to reason about your code... and on that count I have found most of the new features have really helped.
unethical_ban · 7 years ago
You can write very Python2.7 looking code with Python3. I don't think many syntax changes/deprecations have occurred (I know some have).
themeiguoren · 7 years ago
Yep, I did a 2to3 conversion recently and it got the whole project 95% of the way there. A 3to2 would be in theory almost as simple to do for most projects.
l0b0 · 7 years ago
My first thought was the same as the snarky sibling comment, but after reading TFA I realized these are all features I've used in other languages and detest. The walrus operator and complex string formatting are both character-pinching, anti-maintainability features.
raymondh · 7 years ago
To me, the headline feature for Python 3.8 is shared memory for multiprocessing (contributed by Davin Potts).

Some kinds of data can be passed back and forth between processes with near zero overhead (no pickling, sockets, or unpickling).

This significantly improves Python's story for taking advantage of multiple cores.

acqq · 7 years ago
For us who didn't follow:

"multiprocessing.shared_memory — Provides shared memory for direct access across processes"

https://docs.python.org/3.9/library/multiprocessing.shared_m...

And it has the example which "demonstrates a practical use of the SharedMemory class with NumPy arrays, accessing the same numpy.ndarray from two distinct Python shells."

Also, SharedMemory

"Creates a new shared memory block or attaches to an existing shared memory block. Each shared memory block is assigned a unique name. In this way, one process can create a shared memory block with a particular name and a different process can attach to that same shared memory block using that same name.

As a resource for sharing data across processes, shared memory blocks may outlive the original process that created them. When one process no longer needs access to a shared memory block that might still be needed by other processes, the close() method should be called. When a shared memory block is no longer needed by any process, the unlink() method should be called to ensure proper cleanup."

Really nice.
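A minimal single-process sketch of the create/attach cycle described above (in practice the attach would happen in a second process; assumes Python 3.8):

```python
from multiprocessing import shared_memory

# One process creates a named block...
shm = shared_memory.SharedMemory(create=True, size=16)
try:
    shm.buf[:5] = b"hello"

    # ...and another attaches to it by name -- no pickling involved.
    peer = shared_memory.SharedMemory(name=shm.name)
    print(bytes(peer.buf[:5]))  # b'hello'
    peer.close()
finally:
    shm.close()
    shm.unlink()  # release the block once no process needs it
```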

quietbritishjim · 7 years ago
It looks like this will make efficient data transfer much more convenient, but it's worth noting this had always been possible with some manual effort. Python has had `mmap` support at least as long ago as Python 2.7, which works fine for zero-overhead transfer of data.

With mmap you have to specify a file name (actually a file number), but so long as you set the length to zero before you close it there's no reason any data would get written to disk. On Unix you can even unlink the file before you start writing it if you wish, or create it with the tempfile module and never give it a file name at all (although this makes it harder to open in other processes as they can't then just mmap by file name). The mmap object satisfies the buffer protocol so you can create numpy arrays that directly reference the bytes in it. The memory-mapped data can be shared between processes regardless of whether they use the multiprocessing module or even whether they're all written in Python.

https://docs.python.org/3.7/library/mmap.html
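A minimal sketch along those lines, using a temporary file so an unrelated process could map the same path (the cleanup steps follow the comment's suggestion of truncating before close):

```python
import mmap
import os
import tempfile

# Create a file-backed mapping; another process could mmap the same
# path and see the bytes with zero copying.
fd, path = tempfile.mkstemp()
os.ftruncate(fd, 4096)   # give the mapping its size
buf = mmap.mmap(fd, 4096)
buf[:5] = b"hello"
print(bytes(buf[:5]))    # b'hello'

# Truncate before closing so nothing lingers on disk.
buf.close()
os.ftruncate(fd, 0)
os.close(fd)
os.unlink(path)
```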

mncharity · 7 years ago
Also on linux is sysv_ipc.SharedMemory.
agent008t · 7 years ago
Isn't that already the case?

I thought that when you use multiprocessing in Python, a new process gets forked, and while each new process has separate virtual memory, that virtual memory points to the same physical location until the process tries to write to it (i.e. copy-on-write)?

jashmatthews · 7 years ago
That's true but running VMs mutate their heaps, both managed and malloced. CoW also only works from parent to child. You can't share mutable memory this way.

Empty space in internal pages gets used for allocating new objects, reference counts get updated, GC flags get flipped, etc., and it takes just one write in each 4 KB page to trigger a whole-page copy.

It doesn't take long before a busy web worker etc will cause a huge chunk of the memory to be copied into the child.

There are definitely ways to make it much more effective like this work by Instagram that went into Python 3.7: https://instagram-engineering.com/copy-on-write-friendly-pyt...

amelius · 7 years ago
Yes, the problem is sharing data between parent and child after the parent process has been forked.
paulddraper · 7 years ago
Yes, sharing pre-fork data is as old as fork().

Sharing post-fork data is where it gets interesting.

sametmax · 7 years ago
If you have 4 cores, you may want to spawn 4 children, then share stuff between them. Not just top-down.

E.g.: live settings, cached values, white/black lists, etc.

amelius · 7 years ago
> no pickling, sockets, or unpickling

But still copying?

If not, then how does it interoperate with garbage collection?

fulafel · 7 years ago
It works with the memoryview/buffer interface, so you can have eg a Numpy array backed by a sharedmemory object attached to a named SM segment.

So it's not for containing normal Python dicts, strings etc that are individually tracked by GC.

https://docs.python.org/3.8/library/multiprocessing.shared_m...

aportnoy · 7 years ago
I’ve been waiting for this for a very long time. Thank you for mentioning this.

Would this work with e.g. large NumPy arrays?

(and this is Raymond Hettinger himself, wow)

solarist · 7 years ago
An alternative you may want is Dask.
aidos · 7 years ago
Oh no way. That has huge potential. What are the limitations?
gigatexal · 7 years ago
Agreed this is huge.
londons_explore · 7 years ago
I long for a language which has a basic featureset, and then "freezes", and no longer adds any more language features.

You may continue working on the standard library, optimizing, etc. Just no new language features.

In my opinion, someone should be able to learn all of a language in a few days, including every corner case and oddity, and then understand any code.

If new language features get added over time, eventually you get to the case where there are obscure features everyone has to look up every time they use them.

vindarel · 7 years ago
Common Lisp seems to tick the boxes. The syntax is stable and it doesn't change. New syntax can be added through extensions (pattern matching, string interpolation, etc). The language is stable, meaning code written in pure CL still runs 20 years later. Then there are de-facto standard libraries (bordeaux-threads, lparallel,…) and other libraries. Implementations continue to be optimized (SBCL, CCL) and to develop core features (package-local-nicknames), and new implementations arise (Clasp, CL on LLVM, notably for bioinformatics). It was rough at the beginning but has been a joy so far.

https://github.com/CodyReichert/awesome-cl

nine_k · 7 years ago
The "very compact, never changing" language will end up not very expressive, and thus prone to boilerplate; look at Go.

Lisps avoid this by building abstractions from the same material as the language itself. Basically no other language family has this property, though JavaScript and Kotlin, via different mechanisms, achieve something similar.

DonHopkins · 7 years ago
The Turing Machine programming language specification has been frozen for a long time, and it's easy to learn in a few days.

So has John von Neumann's 29 state cellular automata!

https://en.wikipedia.org/wiki/Von_Neumann_cellular_automaton

https://en.wikipedia.org/wiki/Von_Neumann_universal_construc...

(Actually there was a non-standard extension developed in 1995 to make signal crossing and other things easier, but other than that, it's a pretty stable programming language.)

>Renato Nobili and Umberto Pesavento published the first fully implemented self-reproducing cellular automaton in 1995, nearly fifty years after von Neumann's work. They used a 32-state cellular automaton instead of von Neumann's original 29-state specification, extending it to allow for easier signal-crossing, explicit memory function and a more compact design. They also published an implementation of a general constructor within the original 29-state CA but not one capable of complete replication - the configuration cannot duplicate its tape, nor can it trigger its offspring; the configuration can only construct.

mr_crankypants · 7 years ago
Such languages exist. Ones that come to mind offhand are: Standard ML, FORTH, Pascal, Prolog.

All of which are ones that I once thought were quite enjoyable to work in, and still think are well worth taking some time to learn. But I submit that the fact that none of them have really stood the test of time is, at the very least, highly suggestive. Perhaps we don't yet know all there is to know about what kinds of programming language constructs provide the best tooling for writing clean, readable, maintainable code, and languages that want to try and remain relevant will have to change with the times. Even Fortran gets an update every 5-10 years.

I also submit that, when you've got a multi-statement idiom that happens just all the time, there is value in pushing it into the language. That can actually be a bulwark against TMTOWTDI, because you've taken an idiom that everyone wants to put their own special spin on, or that they can occasionally goof up on, and turned it into something that the compiler can help you with. Java's try-with-resources is a great example of this, as are C#'s auto-properties. Both took a big swath of common bugs and virtually eliminated them from the codebases of people who were willing to adopt a new feature.

zephyrfalcon · 7 years ago
Prolog has an ISO standard... I am not sure if it's still evolving, but specific Prolog implementations can and often do add their own non-standard extensions. For example, SWI-Prolog added dictionaries and a non-standard (but very useful) string type in version 7.

That said, it is nice that I can take a Prolog text from the 1980s or 1990s and find that almost all of the code still works, with minor or no modifications...

mbo · 7 years ago
Elixir?

From the v1.9 release just a few weeks ago: https://elixir-lang.org/blog/2019/06/24/elixir-v1-9-0-releas...

> As mentioned earlier, releases was the last planned feature for Elixir. We don’t have any major user-facing feature in the works nor planned. I know for certain some will consider this fact the most exciting part of this announcement!

> Of course, it does not mean that v1.9 is the last Elixir version. We will continue shipping new releases every 6 months with enhancements, bug fixes and improvements.

elgenie · 7 years ago
That's just an announcement that they reached the end of the list of user-facing syntax changes on their roadmap.
scribu · 7 years ago
Interesting!

I imagine churn will still happen, except it will be in the library/framework ecosystem around the language (think JavaScript fatigue).

poiuyt098 · 7 years ago
Brainfuck has been extremely stable. You can learn every operator in minutes.
nerdponx · 7 years ago
> someone should be able to learn all of a language in a few days, including every corner case and oddity, and then understand any code.

Why should this be true for every language? Certainly we should have languages like this. But not every language needs to be like this.

esfandia · 7 years ago
Well, maybe not for every language, but probably for a language where simplicity has been a major feature.
arkaic · 7 years ago
Verrrrrrry few languages in common use are like this.
fatbird · 7 years ago
All you're doing then is moving the evolution of the language into the common libraries, community conventions, and tooling. Think of JavaScript before ES2015: it had stayed almost unchanged for more than a decade, and as a result, knowing JavaScript meant knowing JS and jQuery, prototype, underscore, various promise libraries, AMD/commonjs/require based module systems, followed by an explosion of "transpiled to vanilla JS" languages like coffeescript. The same happened with C decades earlier: while the core language in K&R C was small and understandable, you really weren't coding C unless you had a pile of libraries and approaches and compiler-specific macros and such.

Python, judged against JS, is almost sedate in its evolution.

It would be nice if a combination of language, libraries, and coding orthodoxy remained stable for more than a few years, but that's just not the technology landscape in which we work. Thanks, Internet.

elgenie · 7 years ago
It's apples and oranges.

Python was explicitly designed and, for the vast majority of its nearly 30-year history, had a dedicated BDFL functioning as a standards body.

JS, on the other hand, was hacked together in a week in the mid-90s and then the baseline implementation that could be relied on was emergent behavior at best, anarchy at worst for 15 years.

jnwatson · 7 years ago
The only frozen languages are the ones nobody uses except for play or academic purposes.

As soon as people start using a language, they see ways of improving it.

It isn't unlike spoken languages. Go learn Esperanto if you want to learn something that doesn't change.

BillChapman · 7 years ago
Esperanto does change, in that new items of vocabulary are introduced from time to time. For example, 'mojosa', the word for 'cool', is only about thirty years old.
colechristensen · 7 years ago
This is why a lot of scientific code still uses Fortran: code written several decades ago still compiles and produces the same output.

How long has the code which was transitioned to python lasted?

airstrike · 7 years ago
> How long has the code which was transitioned to python lasted?

A long time. 2to3 was good for ~90% of my code, at least

tjalfi · 7 years ago
I have compiled Fortran programs from the 70s on modern platforms without changing a line. The compiler, OS, and CPU architecture had all disappeared but the programs still worked correctly.
yxhuvud · 7 years ago
Fortran has added a whole lot of features over time though.
stefco_ · 7 years ago
This isn't that good of a metric for code utility. Sure, very-long-lived code probably solved the problem well (though it can also just be a first-mover kind of thing), but a lot of code is written to solve specific problems in a way that's not worth generalizing.

I write a lot of python for astrophysics. It has plenty of shortcomings, and much of what's written will not be useful 10 years from now due to changing APIs, architectures, etc., but that's partly by design: most of the problems I work on really are not suited to a hyper-optimized domain-specific language like FORTRAN. We're actively figuring out what works best in the space, and shortcomings of python be damned, it's reasonably expressive while being adequately stable.

C/FORTRAN stability sounds fine and good until you want to solve a non-mathematical problem with your code or extend the old code in some non-trivial way. Humans haven't changed mathematical notations in centuries (since they've mostly proven efficient for their problem space), but even those don't always work well in adjacent math topics. The bra-ket notation of quantum mechanics, <a|b>, was a nice shorthand for representing quantum states and their linear products; Feynman diagrams are laughably simple pictograms of horrid integrals. I would say that those changes in notation reflected exciting developments that turned out to persist; so it is with programming languages, where notations/syntaxes that capture the problem space well become persistent features of future languages. Now, that doesn't mean you need to code in an "experimental" language, but if a new-ish problem hasn't been addressed well in more stable languages, you're probably better off going where the language/library devs are trying to address it. If you want your code to run in 40 years, use C/FORTRAN and write incremental improvements to fundamental algorithm implementations. If you want to solve problems right now that those langs are ill-suited to, though, then who cares how long the language specs (or your own code) last as long as they're stable enough to minimize breaking changes/maintenance? This applies to every non-ossified language: the hyper-long-term survival of the code is not the metric you should use (in most cases) when deciding how to write your code.

My point is just that short code lifetimes can be perfectly fine; they can even be markers of extreme innovation. This applies to fast-changing stuff like Julia and Rust (which I don't use for work because they're changing too quickly, and maintenance burdens are hence too high). But some of their innovative features will stand the test of time, and I'll either end up using them in future versions of older languages, or I'll end up using the exciting new languages when they've matured a bit.

hu3 · 7 years ago
From what I've seen, Go is the closest we have for mainstream language resistant to change.
zubspace · 7 years ago
Recently the Go team decided not to add the try-keyword to the language. I'm not a Go programmer and was a bit stumped by the decision until I saw a talk of Rob Pike regarding the fundamental principle of Go to stick to simplicity first. [1]

One of the takeaways is, that most languages and their features converge to a point, where each language contains all the features of the other languages. C++, Java and C# are primary examples. At the same time complexity increases.

Go is different, because of the simplicity first rule. It easens the burden on the programmer and on the maintainer. I think python would definitely profit from such a mindset.

[1] https://www.youtube.com/watch?v=rFejpH_tAHM

orangecat · 7 years ago
> In my opinion, someone should be able to learn all of a language in a few days, including every corner case and oddity, and then understand any code.

"Understanding" what each individual line means is very different from understanding the code. There are always higher level concepts you need to recognize, and it's often better for languages to support those concepts directly rather than requiring developers to constantly reimplement them. Consider a Java class where you have to check dozens of lines of accessors and equals and hashCode to verify that it's an immutable value object, compared to "data class" in Kotlin or @dataclass in Python.
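A minimal sketch of that comparison, using only the standard library (`Point` is an illustrative name):

```python
from dataclasses import dataclass

# @dataclass generates __init__, __repr__, and __eq__ from the field
# declarations alone; frozen=True also makes instances hashable.
@dataclass(frozen=True)
class Point:
    x: int
    y: int

p, q = Point(1, 2), Point(1, 2)
print(p == q)              # value equality, no hand-written __eq__
print(hash(p) == hash(q))  # usable as dict keys / set members
```

The equivalent Java class would need an explicit constructor plus `equals` and `hashCode` overrides to get the same guarantees.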

toolslive · 7 years ago
Sometimes a language introduces a concept that's new to you. Then you need way more time. For example, monads : I understood it (the concept) rather quickly, but it took a few weeks to get it down so I could benefit from it.
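For the curious, a loose sketch of the idea in plain Python (illustrative names, not a real library): the point of `bind` is that failure (`None`) short-circuits the chain without nested if-checks.

```python
def bind(value, fn):
    # Maybe-style bind: propagate None instead of calling fn on it.
    return None if value is None else fn(value)

def parse_int(s):
    try:
        return int(s)
    except ValueError:
        return None

result = bind(bind("42", parse_int), lambda n: n + 1)
print(result)  # 43
print(bind(bind("oops", parse_int), lambda n: n + 1))  # None
```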
orwin · 7 years ago
Try C maybe? It is still updated, but only really minor tweaks for optimisation.

Also, the Common Lisp spec hasn't changed since the 90s, and it's still useful as a "quick and dirty" language, with little basic knowledge required. But the "basic feature set" can build everything, so the "understand any code" part is not really respected. Maybe Clojure is easier to understand (and also has a more limited base feature set, with no CLOS).

kazinator · 7 years ago
C compilers like GCC and Clang have dialect selection options that work; if you pick -std=c90, you can write C like it's 1990.
baq · 7 years ago
remember the gang of four book? such books happen when the language is incapable of expressing ideas concisely. complexity gets pushed to libraries which you have to understand anyway. i'd rather have syntax for the visitor pattern or whatever else is there.
markrages · 7 years ago
Python 2.7 is not far from that language.
tomkat0789 · 7 years ago
What's stopping people from forking the language at python 2.7? Let the pythonistas add whatever feature they feel like while people who need stability use "Fortran python" or whatever.
plopz · 7 years ago
Isn't that what C is?
FPGAhacker · 7 years ago
Certainly Common Lisp.
alexhutcheson · 7 years ago
Lua is pretty close, and pretty close to Python in terms of style and strengths.

Edit: I actually forgot about the split between LuaJIT (which hasn’t changed since Lua 5.1), and the PUC Lua implementation, which has continued to evolve. I was thinking of the LuaJIT version.

linsomniac · 7 years ago
I'm in operations and I've spent much of my career writing code for the Python that worked on the oldest LTS release in my fleet, and for a very long time that was Python 1.5...

I was really happy, in some ways, when Python 2 was announced as getting no new releases and Python 3 wasn't ready, because it allowed a kind of unification of everyone on Python 2.7.

Now we're back on the treadmill of chasing the latest and greatest. I was kind of annoyed when I found I couldn't run Black to format my code because it required a slightly newer Python than I had. But... f strings and walrus are kind of worth it.

owaislone · 7 years ago
That's what Go has been so far but it might see some changes soon after being "frozen" for ~10 years.
diminoten · 7 years ago
Why can't you do this with Python? No one said you had to use any of these new features...

Though to me that's like saying, "I want this river to stop flowing" or "I'd prefer if the seasons didn't change."

dwaltrip · 7 years ago
All human languages change over time. It is the nature of language.
chewxy · 7 years ago
Go? I moved a lot of my datascience and machine learning process to Go. Only thing really left in Python land is EDA
Areading314 · 7 years ago
Absolutely agree. How many times have you heard "that was true until Python 3.4 but now is no longer an issue" or "that expression is illegal for all Pythons below 3.3", and so on. Not to mention the (ongoing) Python 2->3 debacle.
TomBombadildoze · 7 years ago
> Not to mention the (ongoing) Python 2->3 debacle.

When will this talking point die? It's not "ongoing". There's an overwhelming majority who have adopted Python 3 and a small population of laggards.

orf · 7 years ago
Who cares about syntax that doesn’t work in old, dead versions of Python 3? 3.5 and above is all that matters.
coleifer · 7 years ago
Lua is a great small language.

Dead Comment

Dead Comment

stakhanov · 7 years ago
Speaking as someone who has written Python code almost every day for the last 16 years of my life: I'm not happy about this.

Some of this stuff seems to me like it's opening the doors for some antipatterns that I'm consistently frustrated about when working with Perl code (that I didn't write myself). I had always been quite happy about the fact that Python didn't have language features to blur the lines between what's code vs what's string literals and what's a statement vs what's an expression.

sametmax · 7 years ago
F-strings appeared two versions ago. All in all, the feedback we have has been overwhelmingly positive, including on maintenance and readability.
theplague42 · 7 years ago
I second this. F-strings make string formatting so much more concise. I'm excited about the walrus operator for the same reason.
sleavey · 7 years ago
I love f-strings. I just wish tools like pylint would shut up when I pass f-strings to the logging module. I as the developer understand and accept the extra nanosecond of processor time to parse the string that might not be logged anywhere!
pmontra · 7 years ago
F-strings are great and should have been in the language since the beginning. Many other languages have had their own version of them since version 0. What I don't understand is why Python needs a special string type when other languages can interpolate normal strings (Ruby, Elixir, JavaScript.)
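A side-by-side of the three generations, for reference (the f-prefix is what lets the parser treat the braces as code rather than literal text, keeping plain string literals backward compatible):

```python
name, score = "walrus", 3.14159

old = "%s scored %.2f" % (name, score)        # printf-style operator
fmt = "{} scored {:.2f}".format(name, score)  # str.format (2.6+)
new = f"{name} scored {score:.2f}"            # f-string (3.6+)

assert old == fmt == new == "walrus scored 3.14"
```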
stakhanov · 7 years ago
...but every addition to make them more powerful and feature-rich is one more step in the direction of blurring the lines between what's code and what isn't, since more and more things that are supposed to be code will be expressed in ways that aren't code at all but fed to an interpreter inside the interpreter. And with every release, the language specification that I'm having to hold in my head when dealing with other people's code grows more and more complex while the cost-benefit calculation around the additional complexity shows diminishing returns.

It kind of goes to the question: When is a language "finished"?

hk__2 · 7 years ago
As someone who has written Python code almost every day for both professional and personal projects for a few years: I’m really happy about these assignment expressions. I wish Python would have more expressions and fewer statements, like functional languages.
duckerude · 7 years ago
Do you have an example of bad code you'd expect people to use assignment expressions and f-strings for?

I don't think I've come across any f-string abuse in the wild so far, and my tentative impression is that there's a few patterns that are improved by assignment expressions and little temptation to use them for evil.

It helps that the iteration protocol is deeply ingrained in the language. A lot of code that could use assignment expressions in principle already has a for loop as the equally compact established idiom.

ashton314 · 7 years ago
Many languages don't distinguish between statements and expressions—in some languages, this is because everything is an expression! I'm most familiar with these kinds of languages.

I'm not familiar much with Python, beyond a little I wrote in my linear algebra class. How much does the statement/literal distinction matter to readability? What does that do for the language?

stakhanov · 7 years ago
The philosophy that most of Python's language design is based on is that for everything you want to do, there should be one and only one obvious way to do it.

The first part of the statement (at least one obvious way to do it) goes to gaining a lot of expressive power from having learned only a subset of the language specification corresponding to the most important concepts. So you invest only a small amount of time in wrapping your head around only the most important/basic language concepts and immediately gain the power that you can take any thought and express it in the language and end up not just with some way of doing it, but with the right/preferred way of doing it.

The second part of the statement (at most one obvious way to do it) makes it easy to induce the principles behind the language from reading the code. If you take a problem like "iterate through a list of strings, and print each one", and it always always always takes shape in code by writing "for line in lst: print( line )" it means that, if it's an important pattern, then a language learner will get exposed to this pattern early and often when they start working with the language, so has a chance to quickly induce what the concept is and easily/quickly memorize it due to all the repetition. -- Perl shows how not to do it, where there are about a dozen ways of doing this that all end up capable of being expressed in a line or two. -- Therefore, trying to learn Perl by working with a codebase that a dozen people have had their hands on, each one preferring a different variation, makes it difficult to learn the language, because you will now need to know all 12 variations to be able to read Perl reliably, and you will only see each one 1/12th as often making it harder to memorize.

nerdponx · 7 years ago
The only reason I can imagine being opposed to it is fear that hordes of bad programmers will descend on the language and litter the ecosystem with unreadable golfed garbage.

I obviously don't want that. I don't think anybody wants that. But I also don't think that's going to happen as a result of the recent changes in the language. If anything, I feel like the average code quality in the wild has gone up.

ezrast · 7 years ago
It's natural for some operations to be used only for their side effects, and for those a return value is just noise. What does a while loop evaluate to in your favorite language? Are there any circumstances where you'd want to assign one to a variable? What do you lose by making that a parser error?
keymone · 7 years ago
> what's a statement vs what's an expression

never understood the need for this. why do you even need statements?

if there's one thing that annoys me in python it's that it has statements. worst programming language feature ever.

stefco_ · 7 years ago
There's a lot of talk in this thread about Python going downhill and becoming less obvious/simple. I rather like modern python, but I agree that some features (like async/await, whose implementation fractures functions and libraries into two colors [0]) seem like downgrades in "Pythonicity".

That said, I think some things have unquestionably gotten more "Pythonic" with time, and the := operator is one of those. In contrast, this early Python feature (mentioned in an article [1] linked in the main one) strikes me as almost comically unfriendly to new programmers:

> Python vowed to solve [the problem of accidentally assigning instead of comparing variables] in a different way. The original Python had a single "=" for both assignment and equality testing, as Tim Peters recently reminded him, but it used a different syntactic distinction to ensure that the C problem could not occur.

If you're just learning to program and know nothing about the distinction between an expression and a statement, this is about as confusing as shell expansion (another context-dependent syntax). It's way too clever to be Pythonic. The new syntax, though it adds an extra symbol to learn, is at least 100% explicit.

I'll add that := fixes something I truly hate: the lack of `do until` in Python, which strikes me as deeply un-Pythonic. Am I supposed to break out of `while True`? Am I supposed to set the variable before and at the tail of the loop (a great way to add subtle typos that will cause errors)? I think it also introduces a slippery slope to be encouraged to repeat yourself: if assigning the loop variable happens twice, you might decide to do something funny the 2:Nth time to avoid writing another loop, and that subtlety in loop variable assignment can be very easy to miss when reading code. There is no general solution I've seen to this prior to :=. Now, you can write something like `while line := f.readline()` and avoid repetition. I'm very happy to see this.
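A runnable sketch of that last form, using `io.StringIO` as a stand-in for a real file:

```python
import io

f = io.StringIO("alpha\nbeta\n")

# The loop-and-a-half idiom collapses into a single header line:
# readline() returns "" at EOF, which is falsy, so the loop ends itself.
lines = []
while line := f.readline():
    lines.append(line.strip())

print(lines)  # ['alpha', 'beta']
```

Note this requires Python 3.8+.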

[0] https://journal.stuffwithstuff.com/2015/02/01/what-color-is-...

[1] https://lwn.net/Articles/757713/

[edit] fixed typos

Asooka · 7 years ago
You are supposed to write

    for x in iter(f.readline, ""):
Or if you don't know what readline will return you can wrap it in your own lambda:

    for x in iter(lambda: f.readline() or None, None):
There is a lot you can do with iter to write the kind of loops you want but it's not well known for some reason. It's a very basic part of the language people seem to overlook. Walrus does however let you write the slightly more useful

    while predicate(x:=whatever()):
Which doesn't decompose easily into iter form.
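For reference, the two-argument form in action (again with `io.StringIO` standing in for a file):

```python
import io

f = io.StringIO("one\ntwo\n")

# iter(callable, sentinel) calls the callable until it returns the
# sentinel -- here, readline's empty string at end-of-file.
collected = [line.strip() for line in iter(f.readline, "")]
print(collected)  # ['one', 'two']
```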

stefco_ · 7 years ago
This is a good solution! I don't directly use `iter` very often so I only remember its simplicity part of the time. Sadly, this is not the idiom I see in most places.

I will say, though, that I was not comfortable using iterators when I first learned python; walrus strikes me as easier to grok for a novice (one of the ostensible Python target demographics) than iter. I'll bet this is why this simple form is not idiomatic (though you're right, it should be).

owlowlowls · 7 years ago
>I'll add that := fixes something I truly hate: the lack of `do until` in Python, which strikes me as deeply un-Pythonic. Am I supposed to break out of `while True`? Am I supposed to set the variable before and at the tail of the loop (a great way to add subtle typos that won't cause errors)?

This is relevant to what I've been doing in OpenCV with reading frames from videos! In tutorial examples on the web, you'll see exactly the sort of pattern that's outlined in the PEP 572 article.

    line = f.readline()
    while line:
        ... # process line
        line = f.readline()

Just, replace readline() with readframe() and the like. So many off-by-one errors figuring out when exactly to break.
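A sketch of how the walrus operator tightens that loop. `FakeCapture` below is a stand-in for an OpenCV-style capture whose `read()` returns `(ok, frame)`, with `frame` as `None` once the video ends:

```python
class FakeCapture:
    """Stand-in for cv2.VideoCapture: read() -> (ok, frame)."""
    def __init__(self, frames):
        self._frames = iter(frames)

    def read(self):
        try:
            return True, next(self._frames)
        except StopIteration:
            return False, None

cap = FakeCapture(["frame0", "frame1"])
processed = []
# Assign and test in one place; no duplicated read() call to get wrong.
while (frame := cap.read()[1]) is not None:
    processed.append(frame)

print(processed)  # ['frame0', 'frame1']
```

This assumes valid frames are never `None`, which matches the `(ok, frame)` convention where `frame` is `None` exactly when `ok` is false.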

Aramgutang · 7 years ago
That example can also be tackled with Python's little-known feature of calling the built-in `iter` with a second argument:

    for line in iter(f.readline, ''):
        ... # process line

See: https://docs.python.org/3/library/functions.html#iter

kbd · 7 years ago
For sure. Walrus operator is "unnecessary" but is a clear improvement in the code that will use it.
thomasahle · 7 years ago
The problem with `while line := f.readline():` is that it takes pressure off library writers. You should really just do `for line in f:`. If the library only has a `next` function, it needs to be fixed.
masklinn · 7 years ago
`f` might be iterable / iterator with completely different semantics than doing by-line iteration. And that might even be a good idea.
vesche · 7 years ago
Was really hoping to see multi-core in 3.8, looks like we'll be waiting until 3.9

https://www.python.org/dev/peps/pep-0554/

https://github.com/ericsnowcurrently/multi-core-python/wiki

Stubb · 7 years ago
A map() function that isn't just an iterated fork() would be glorious. Let me launch a thread team like in OpenMP to tackle map() calls containing SciPy routines and I'll be unreasonably happy.
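Until something like PEP 554 lands, `concurrent.futures` already gives a thread-backed `map`. A sketch, with the caveat that true parallelism depends on the underlying routines releasing the GIL (as many NumPy/SciPy kernels do; the squaring function here is just a placeholder):

```python
from concurrent.futures import ThreadPoolExecutor

def work(x):
    # Placeholder for a SciPy routine; extension code that releases
    # the GIL lets these calls overlap across threads.
    return x * x

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(work, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Pure-Python callables will still serialize on the GIL, which is exactly what PEP 554's subinterpreters aim to fix.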
sleavey · 7 years ago
Without wanting to ignite a debate about the walrus operator (and having not read any of the arguments), I can guess why there was one. It's not clear to me what it does just from reading it, which was always one of Python's beginner-friendlinesses.
coldtea · 7 years ago
>It's not clear to me what it does just from reading it

How isn't it entirely obvious? := is the assignment operator in tons of languages, and there's no reason not to have assignment be an expression (as is also the case in many languages).

txcwpalpha · 7 years ago
> := is the assignment operator in tons of languages

It is? Which ones? Other than Go, I cannot think of a single language that has ":=" as an operator. Java does not, JavaScript does not, C/C++ do not, Ruby does not, I don't think PHP does, Erlang/Elixir do not, Rust does not... (I could be wrong on these, but I've personally never seen it in any of these languages and I can't find any mention of it in these languages' docs).

I tried looking around the internet at various popular programming languages and the only ones I could find that use ":=" are: Pascal, Haskell (but it's used for something else than what Python uses it for), Perl (also used for something else), and Scala (but in Scala it isn't officially documented and doesn't have an 'official' use case).

I don't have a strong opinion about ":=" in Python but I do agree that it's unintuitive and thus not very "Pythonic".

queensnake · 7 years ago
It looks to me like it could be an assignment to const, or a copy vs a non-copy - it’s not obvious at all. I’m sure ‘?=’ was fought over and rejected, but that’s what I’d have expected conditional assignment to look like.
sleavey · 7 years ago
It's not in a language I've ever used (furthermore, I explicitly mentioned beginners in my comment).