My impression is that in the ‘00s, Python and Ruby were both relatively new, dynamically typed, “English-like” languages. And for a while these languages had similar popularity.
Now Ruby is still very much alive; there are plenty of Rails jobs available and exciting things happening with Ruby itself. But Python has become a titan in the last ten years. It has continued to grow exponentially and Ruby has not.
I can guess as to why (Python’s math libraries, numpy and pandas make it appealing to academics; Python is simpler and possibly easier to learn; Rails was so popular that it was synonymous with Ruby) but I wasn’t paying attention at that time. So I’m interested in hearing from some of the older programmers about why Ruby has stalled out and Python has become possibly the most popular programming language (when, in my opinion, Ruby is the better language).
Ruby ended up 'specializing' in web dev, because of Rails. But when Node and React came out, Ruby on Rails had to compete with Nodejs + React / MERN as a way of building a web app. Since people first learning programming to build a web app would usually start with javascript anyway (since a lot of very first projects might not even need a backend), it was a lot easier for the Nodejs/React route to become the default path. Whereas if you were a data scientist, you started on python, and as you got better, you basically just kept using python.
I think a lot of people were attracted to the language design, as captured in the Zen of Python (https://peps.python.org/pep-0020/), such as:
Explicit is better than implicit.
Readability counts.
Errors should never pass silently (unless explicitly silenced)
There should be one-- and preferably only one --obvious way to do it.
In many cases, Ruby has almost the opposite philosophy. There's nothing wrong with that - but I think a lot of people prefer Python's choices.
This is so hilariously wrong in python though
That matches my memory as well. I can't find any references, but I seem to remember a quote going around way back circa 2008 that goes something like "Ruby is popular because it's the language used to write Rails; Django is popular because it's written in Python."
Monkeypatching is also awful for readability.
Explicit imports are way more readable than names implicitly appearing in the current namespace, the kind of thing that happens with Ruby.
Also, the batteries-included standard library was incredible when you needed to use the ACE/TAO monster in C++.
Finally, interfacing with native code via SWIG enables you to quickly use critical optimized code.
Obviously, other programming languages have caught up with Python's capabilities since then.
People seem to forget or miss that it took many, many years for Python to become popular outside data science. If it had just been clearly better or easier that transition should have gone much faster.
As someone else mentioned, at one point universities started using Python as an introduction language. I am still sad that they did not choose Ruby or js but here we are.
Personally I think Python is an exceptionally ugly language for one as popular as it is (the magical underscore identifiers really bug me, and I think list comprehensions are deeply inferior to monadic chaining - there's a reason nobody copies them but everyone got LINQ envy). But it's clear from a perusal of code in the areas where Python dominates, data science and machine learning, that aesthetics are very far from people's minds. They'd be using Javascript if it had the libraries available.
What were their choices though, Perl? It's easy to see why Perl lost out. Other than PHP, I don't really know of any other JIT scripting languages they could have chosen.
There’s not even one obvious flavour of Python to use.
It looks too late for 3.13[0]
Maybe they can channel the BDFL in the Packaging[1] thread for version Pi.
[0] https://docs.python.org/3.13/whatsnew/3.13.html
[1] https://discuss.python.org/c/packaging/14
Is that true? My understanding was that it was a scripting language first (still is), but then got taken up by data and science people and various other niches. Then some education books and courses, like the Artificial Intelligence: A Modern Approach and later MIT and other university adopters. And all of that began to snowball where Python was either the only or main language people knew, so they started using it for everything indiscriminately.
What general purpose? That's just a buzzword. Especially back then, shipping Python apps was never a viable option compared to binaries compiled from C/C++ or Java. There never was such a "general purpose".
What I understand is that Python was popular in the anglosphere as a general purpose language first, Ruby was somewhat popular in Japan earlier as a general purpose language but didn't become popular in the anglosphere until Rails.
In what way is numpy readable?
Deleted Comment
The reason one language is more used than others at any given time is way simpler and more bound to humans than the languages themselves: fashion trends, laziness, sloth.
Most of the people out there writing code and "increasing the numbers for any given language" have no real idea why they started with one language rather than another; they never dug deep enough to actually make an informed choice, and most will keep using a single programming language because they "don't feel the need to learn a new one", aka: I'm too lazy to ever go deep into the only language I know, let alone learn a new one. And it's the market's fault: we spent the last decade or more touting how many bazillions of programmers would be needed, how anyone could get a great life by simply learning a bit of coding, etc. No one gave a fuck about quality, the only goal being to cheapen the software developer profession further and further, until neural networks came about and indirectly revealed the truth: we haven't been raising SW developers/engineers/etc, most of them were just code typists copying out of Stack Overflow. If something like Copilot or ChatGPT can substitute for them, it means there wasn't much value there in the first place. In 2007, Jeff Atwood made the quote that was popularly referred to as Atwood's Law: "Any application that can be written in JavaScript, will eventually be written in JavaScript." And that's NOT a good thing; it's just the epitome of the state of the industry.
In Python's case its luck was Google: Python (like Go, for instance) is a convenient language for system automation, let's say a saner version of what Perl was mostly used for in the past (if you notice, lots of the Zen of Python's ideas are attempts to fix Perl's insanity). Google has lots of systems engineering going on, lots of people using (and abusing) Python, and a single repo where everything ends up, and when they started making neural networks with it, Python got fashionable for making neural networks. Anyone and their dog wanting to try out some kind of machine learning (10+ years ago) would find a tutorial in Python, and TensorFlow sealed the deal.
Yes, numpy and pandas did have quite a bit of weight in luring the math community into using Python, but there's nothing inherent in Python that makes them possible; they could have been made in any other language. For instance, Haskell and Lisp are way more approachable from a math standpoint; they're just not in fashion any more.
The difference is that Python doesn't entirely suck as a general-purpose language. Sure, you might have better options, but it's still reasonable to write almost anything in Python.
Other scripting languages like Ruby, JS and Lua are probably a little bit better when evaluated strictly on their merits as programming languages, but they lack the ecosystem.
In other words, it might be inelegant, slow, a mess, whatever, but the concept behind it is basically correct.
I'd immediately throw out "competitors" that cost hundreds of dollars to use. https://www.mathworks.com/pricing-licensing.html Academic pricing is $275/yr and normal is almost a thousand a year. Then you pay for modules on top.
I haven't seen anything an "A-type" data scientist needs in R/SAS/SPSS that was intrinsic to the language and couldn't be ported to Python. I don't want a "data/scientific" language, or a "web" language, or a "UI" language- I want one language that explicitly supports all the usecases well-enough.
Same goes for R/SAS.
Python has no competitor in terms of the whole package: it is a full-featured language while being particularly good at data.
I've done a fair bit of R and there is no doubt that as a pure data science tool it's usually quicker to produce something than Python.
But I prefer Python because inevitably any data science project ends up having other "bits" that aren't pure data science, and Python's super strong general-purpose computing libraries are hard to beat.
I always get caught off guard by comments like these. In my mind Python doesn't at all suck as a general purpose language. Only real argument is execution speed, but most people aren't actually writing code where Python's "slowness" matters and if you actually need that speed you can write that part in C and still use it from your Python application.
Comparing python to R, python feels like a much more mature language. R always seemed only useful for small scale projects, e.g. writing some script to process data with < 20 lines of code. Python codebases just scale much better.
I very much disagree and would say it's the opposite. The only competitor for data scientists is R, especially if you're doing stats-heavy analysis (as opposed to ML-heavy analysis). SPSS is in my experience used mainly by people without a data-science background, e.g. in psychology; similarly with Matlab, where the only users I know are engineers.
I would bet that the overwhelming majority of data scientists use Python, even if it's just to set up the data pipeline. It's just the better general programming language, and data scientists are not analysing all the time but have to get the data in shape and available first.
Dead Comment
Ruby was primarily maintained in Japanese, so had a barrier to entry for language level issues. It also lacked english-language evangelists and university presence.
When Ruby was new (invented 1995), Python had some older design issues (as it was a few years older), but it really recovered and implemented a lot of change through Python 2 (2000) and Python 3 (2008). Though there were compatibility issues in the short term, in the long term this change worked out.
Ruby inherited Perl's TIMTOWTDI (there is more than one way to do it) philosophy, which is a little more at odds with the scientific community.
It's at odds with humanity, TBH.
I once spent like two days trying to figure out how a small (couple hundred lines) Ruby script worked, because it overrode `method_missing`.
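Python has a comparable hook, `__getattr__`, with the same readability hazard: call sites can reference methods that are defined nowhere in the file. A minimal sketch (the `Proxy` class and the `get_` naming convention are invented for illustration):

```python
class Proxy:
    # __getattr__ is Python's rough analog of Ruby's method_missing:
    # it intercepts lookups for attribute names that don't exist,
    # so p.get_user() works even though get_user is defined nowhere.
    def __getattr__(self, name):
        if name.startswith("get_"):
            return lambda: f"synthesized {name[4:]}"
        raise AttributeError(name)

p = Proxy()
print(p.get_user())  # synthesized user
```

A reader grepping the codebase for `get_user` finds nothing, which is exactly the two-days-of-debugging problem described above.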
It also helped that the PHP ecosystem had some pretty solid and battle tested HTTP components and pretty productive frameworks--server-side rendering with Varnish was as fast as a CDN-backed site feels today.
Yeah, it was a success 20 years in the making. It had a fair amount of input from scientific people, and that being GvR's background he responded positively; e.g. the extended slices (striding) and the ability to index with an unparenthesised tuple (multidimensional indexing) were very much added for the convenience of numpy's ancestors.
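Both features are visible in plain Python: striding works on built-in sequences, and `a[1, 2]` is sugar for passing a tuple to `__getitem__`, which is what numpy exploits for multidimensional indexing (the `Grid` class here is a toy stand-in):

```python
# Extended slices (striding) on a built-in list
print(list(range(10))[::2])  # [0, 2, 4, 6, 8]

# a[1, 2] is sugar for a[(1, 2)]: __getitem__ receives a tuple
class Grid:
    def __getitem__(self, key):
        return key  # just echo what the subscript syntax passed in

g = Grid()
print(g[1, 2])  # (1, 2)
```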
I would not say it had no competition, Perl used to be quite popular in at least some sci-comp fields (I want to say bioinformatics but I’m probably wrong).
That also happened pretty early. Numpy's ancestor, Numeric, came out in the 90s. But even before that, Python and Perl were about the only two scripting languages other than the shell that were guaranteed to be present on a GNU/Linux system from the start. That made it really popular for creating high-level system utilities in Python. A whole lot of the util-linux packages, libvirt, and the apt packaging system make heavy use of Python. So it's not just the academics already familiar with it, but system administrators and hackers, too.
It also gained widespread popularity as a teaching language alternative to Java. Once MIT started using it in their intro to programming course and put that on the Internet via OCW, it really took off as many people's first exposure to programming.
The batteries included approach to the standard library makes it very usable for one-off automation tasks, too. I don't even like Python that much, but the other day I was doing some math work and just needed to compute a bunch of binomials and enumerate subset combinations. You want to look up how to do that in your language of choice? Python just has math.factorial and itertools.combinations in the standard library. If you're using Linux or Mac, you already have it. It may not be a great choice for application development, but if you need to do some quick, interactive tasks that are never going to get deployed or even necessarily stored on disk as files but just run once from the repl, and it's too much for your calculator app or Excel to handle, Python is perfect.
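The binomial/subset task described above really is a few lines of standard library, no installation needed:

```python
import math
from itertools import combinations

# Binomial coefficient C(5, 2), first via factorials...
n, k = 5, 2
print(math.factorial(n) // (math.factorial(k) * math.factorial(n - k)))  # 10

# ...though math.comb (Python 3.8+) does it directly
print(math.comb(5, 2))  # 10

# Enumerate every 2-element subset of a 4-element set
print(list(combinations("abcd", 2)))
# [('a', 'b'), ('a', 'c'), ('a', 'd'), ('b', 'c'), ('b', 'd'), ('c', 'd')]
```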
Also, Python is less verbose (compared to Java, C#, Scala, pre-ES6 JavaScript, etc.) in terms of syntax, so it is much easier to learn and can do both OOP and functional styles, while being easier to adopt (no compiler, easy installation, lots of popular libraries); hence academia picked it up. Also, Python has built-in parallel-processing features which are less daunting (in terms of concepts and syntax) than other competitors (C++/Java etc).
See the popularity of Go, same thing happened with Python.
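As one concrete instance of those built-in parallel-processing features, the stdlib `multiprocessing.Pool` fans CPU-bound work out across processes with very little ceremony (a minimal sketch; `square` is a made-up workload):

```python
from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == "__main__":
    # Each worker is a separate process, so CPU-bound work
    # sidesteps the GIL entirely.
    with Pool(processes=4) as pool:
        print(pool.map(square, range(8)))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The `if __name__ == "__main__"` guard matters: on platforms that spawn rather than fork, workers re-import the module.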
I would contest the Ruby as web dev part, though. I used Ruby for Puppet and Vagrant a long time ago and it was great, but all my experiences with Rails (and associated gems) turned into maintenance nightmares.
Well, there's R.
But basically, yes that's the long and short of it.
Also, MATLAB was the go-to before Python took over the scene. So it definitely has competitors.
Python's strength is that it is the "jack of all trades" language. It's not the best at anything, but it's pretty good at everything.
It was fun.
From the early days it had a strong leaning toward science; after all, that's where it came from. But at the same time it had a strong presence in Unix circles and established itself as a good tool for sysadmins, competing with Perl and Bash. It also had strong bindings with other languages, which gave it early fame as a glue language. It dabbled early in web stacks and networking too, but had too many web-stack options to gain the "one framework" fame of Ruby on Rails.
Python was simply the language that let you play on all the important fields, with simplicity, and even with some level of speed. This was something other languages lacked: they were either complicated, or slow, or lacking support for specific areas. Python somehow acquired them all and had a big impact across a broad area, which made it so popular, because for the majority of use cases it was a natural option.
Python is, imho, not a very good language. But it has a few great libraries which led to overwhelming momentum and inertia. In a cruel twist of irony those libraries are probably written in C.
Because of the library support (especially for data) it's also popular as api servers. It's not as popular for entire applications but is still very active.
EDIT: Django is actually about equal to or more popular than nodejs based on Google Trends. Not sure that means anything, as idk how many people google "nodejs" when working on it.
There's a big sub-continent of data stuff in the Python ecosystem, but Python the language hasn't specialized there I'd argue - it has enabled the 3rd party number crunching and data libraries like NumPy and Pandas, but not specialized over other domains where Python is used.
https://stratoflow.com/efficient-and-environment-friendly-pr...
Where Python broke in was machine learning which is not analytical work and often involves lots of general purpose programming. sklearn, skimage, opencv, Tensorflow, Torch, JAX, and so on are often used in a general code base. Torch was actually a C++/Lua framework initially before switching to C++/Python.
Python has a dominant general purpose eco-system. It’s also a simple language to learn compared to R/Matlab/etc which are just horrible to use for data structures or paradigms other than vector/matrix based pipelines.
* Udacity's popularity. It's first few free courses were Python. I believe they advertised as a way to get into Google. Not sure how many people got into Google via Udacity though.
* Leet Code Job Interviews. Maybe I'm way off here as I don't do these but from what I've read people who spend a lot of time on these prefer Python because the syntax & libraries often give a speed & simplicity advantage.
* Lots of devs had to install Python to use some utility or tool someone else created.
Overall I think Python is the best language for the non-professional programmer who wants to script things and put their skills on wheels.
One reason why I switched our team from Matlab to Python was the capabilities of the core libraries. Argparse alone sold it for us.
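For anyone who hasn't felt that pain: the stdlib gives you flag parsing, type conversion, defaults, and auto-generated `--help` in a few lines. (The file name and options below are invented for the demo, and a sample argv is parsed instead of `sys.argv` so the snippet runs anywhere.)

```python
import argparse

parser = argparse.ArgumentParser(description="Process a data file")
parser.add_argument("path", help="input file")
parser.add_argument("--threshold", type=float, default=0.5)
parser.add_argument("-v", "--verbose", action="store_true")

# Parse a sample command line instead of the real sys.argv
args = parser.parse_args(["data.csv", "--threshold", "0.8", "-v"])
print(args.path, args.threshold, args.verbose)  # data.csv 0.8 True
```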
However aside from that R is vastly inferior.
> When you're writing working code nearly as fast as you can type and your misstep rate is near zero, it generally means you've achieved mastery of the language. But that didn't make sense, because it was still day one and I was regularly pausing to look up new language and library features!
> This was my first clue that, in Python, I was actually dealing with an exceptionally good design. Most languages have so much friction and awkwardness built into their design that you learn most of their feature set long before your misstep rate drops anywhere near zero. Python was the first general-purpose language I'd ever used that reversed this process.
[1] https://www.linuxjournal.com/article/3882
People decided to wait on doing anything big, since Perl 6 wouldn't be quite compatible. Later it turned out to be extremely incompatible. And Perl 5 would be dead long term, so why spend time on it, when you'd have to do a radical rewrite soon?
Meanwhile, Python was there.
And Perl 6/Raku took 15 years to finally arrive in some form, by which point Python had completely eaten its lunch.
I started to teach myself Python just before 2000.
The general consensus before then was that for a first language Perl or Python would be a good choice. Python was typically preferred because it was more approachable than Perl i.e more succinct.
With no prior programming experience (besides manually typing out programs into my C64 as a kid) I managed to build a program that connected to my webpage hosted by my ISP and changed a link in a page to point to a web-server I was running on my PC by just referring to the standard documentation.
That’s one point that doesn’t get mentioned I think. The standard documentation was clear and accessible in a variety of formats. Also Python on Windows was very easy to set up.
My dim memory of that time is that Python code was often longer than Perl, but that was OK because its selling-point was that it didn't have so many syntax edge-cases and it was easier to read. More things were actual words-for-humans as opposed to special symbols one had to memorize and apply according to special rules.
Part of my question is why Python was used as a successor to Perl and not Ruby, but it sounds like Python was established as a competitive scripting language 5 years before people were talking about Ruby.
I can’t find it now, but there was another comment that pointed out that the languages weren’t contemporaries.
This really jibes with my experience. I learned Ruby first and used it as my scripting language of choice for years, but I constantly found myself having to look up how to do even the most basic tasks.
A day or two after learning Python I was already as productive as I had been with Ruby, it was wild.
This isn't so much a jab at Ruby, just that Python worked the way I thought a programming language should, which made everything easy.
Meanwhile Python always felt like an ill-fitting suit to me. I just personally didn't connect with the language design.
But, to each their own!
Frankly I think language design is secondary to IDE integration. The only time I've ever coded without thinking at all was using Java with IntelliJ or C# with Visual Studio. Type-matching and library introspection all work flawlessly 99% of the time. Not only do I not need to know the details of syntax, but I didn't even need to know APIs because I could autocomplete everything all the time. Python with VS Code still feels stone age by comparison.
I changed companies and learned Python because everyone was raving about it on the Web. I was so confused when I started using it, I thought I had a misconfigured environment or that I was missing some libraries when I started writing Python.
Well, it isn't intuitive now. It was a lot better than Perl back in 2002 or so, when it really started taking off.
Now, a lot of serious Python code is simply incomprehensible.
I still have to use python though. I make an effort to type-annotate everything I see, and python's type annotation features keep improving, but it still often feels like an awkward uphill battle
The situation has changed now that having at least somewhat reasonable IDE integration is common. AI tools can also give you pretty good autocomplete.
Python was popular as a teaching language, how it gained a foothold in academia.
I remember 20 years ago when I was starting out, it was popular for game scripting (Lua was more embeddable, but Python was also used), with plenty of first-class ways to interface with native code (I still remember C++ Boost shipping a Python binding generator).
Many tools shipped python as an embedded scripting language.
Python was used for creating build systems.
Not to mention any distro out there ships python, it replaced perl in that regard. Batteries included and being available helped it become the norm for random scripting.
Python was big before numpy/ML.
Nothing against Plone (although I was very happy to put its foundation layer, Zope, in my rear view mirror). It's a fine program that's very good at what it does. But just because something's been around a while doesn't mean that it's still vibrant and growing.
Now it's probably the third option, when building websites... and a fourth option, when building APIs.
Python was all over the place when RoR came out, meanwhile I only remember hearing about Ruby on random forum posts or by some enthusiast before RoR.
Outside of this, it's been Perl and Python, with Python replacing Perl after the whole Perl 6 fiasco, while the language and distros got better at packaging Python.
Deleted Comment
Chef you obviously had to, but my impression is that the DSL is so extensive you might as well be writing something that's not Ruby.
Maybe this says something about the power of Ruby, but I don't feel like these tools contributed to the language's popularity or ecosystem - definitely not the way RoR did.
Dead Comment
I had dabbled with Python before encountering it (in its embedded form), but once I started using it inside ESRI's ArcGIS software, it became my go to language for many, many tasks.
Then in about 2005 I heard about zope, and how easy and wonderful it was compared to php. But it was slow and memory heavy.
In about 2008 more and more stuff started getting python bindings, specifically in VFX (where I was working). There was a tussle between javascript and python, and python won out, even displacing MEL and other domain specific languages.
Ruby was always more obscure to me; it was for people that liked C and wanted a more perl-y version of it, but without the anarchy or the universality of CPAN.
I think the biggest reason was that it had loads of stuff in the standard lib, way more than anything else at the time. Much more than perl, so it meant that things were a bit more "portable". So people came for the stdlib, then started adding things to it because they'd spent all that time learning it.
Because of this, lots of people learned Python, and then applied it in many different areas, and it just became more prevalent and useful.
I think the other part of Python's success compared to JS is a better story about C integration. Python itself is just a series of macros to make C nicer to use, which meant the existing C/C++ ecosystem was relatively easy to integrate. And then the sci-py world also brought in Fortran. JS is huge on the backend web, but it hasn't spread into other ecosystems nearly as much because the C integration story isn't as good.
Python basically solved this problem by wrapping C code that's far more performant but still exceptionally simple to leverage thanks to Python's simple syntax. LlamaCpp has C++ in its name, yet its most popular platform is Python. So for certain applications, Python became the undisputed #1: it was C in disguise, with better usability.
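The stdlib `ctypes` module shows how thin that wrapper can be, here calling libc's `strlen` directly. (A sketch that assumes a Unix-like system; library lookup varies by platform.)

```python
import ctypes
import ctypes.util

# Load the C standard library and declare strlen's signature
libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

# The call crosses into compiled C, but reads like ordinary Python
print(libc.strlen(b"hello"))  # 5
```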
With parallelization, Python having 100x slower loops became an old problem. 2023's coding paradigm is "if you're using loops, you're doing it wrong." People love complaining about pandas, but Dask solved all the single-core problems in 2021.
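The loop-avoidance point in miniature, using plain numpy rather than Dask (the principle is the same: push the per-element loop into compiled code):

```python
import numpy as np

x = np.arange(1_000, dtype=np.int64)

# Python-level loop: one interpreter round-trip per element
slow = 0
for v in x:
    slow += int(v) * int(v)

# Vectorized: the loop runs inside numpy's compiled C kernels
fast = int((x * x).sum())

print(slow, fast)  # both 332833500
```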
> https://twitter.com/iammemeloper/status/1692901551558320254
Amen
I would rather call it a work-around. Having to switch to a vastly different language to get halfway decent performance is hardly a good solution.
It's not a rare thing in normal applications that you need to have some routine fast and there's no pre-built C solution for it.
Pointing out any problem where speed is the difference between working and unusable—e.g. ray tracing—is probably too cheap a shot, as it’s not like Ruby or Perl are any better in that regard, and LuaJIT came too late to be relevant to the current question.
One range of problems where I’ve found Python (unextended by a code generator) to be surprisingly awful, though, is everywhere the ML style of algebraic datatypes + pattern matching is natural: compilers, basically, all the way down to a regex engine. There’s just no comparison.
Maybe things have changed now that `match` exists? I’ve not yet had the time to try, even if the double indentation doesn’t make me hopeful.
And when you get older, and you care more about solving problems than about trying new things, and you've got more responsibilities in life and lack the time to devote to learning new language / technology du jour, then knowing and using Python becomes so handy.
If it is indeed the 2nd best language for the job, it is still a decent choice if you get to solve the problem.
many accounts of people dropping other languages with better perf or semantics to use python as a prototyping++ tool
it was a trend in many dynlangs (php and others) where you'd write the core in it and drop to C for hot loops
python is better than php, it has some metaprogramming (hello CLOS) to mold it syntactically wise, good enough stdlib, low enough semantic warts ..
that, and numpy/pandas for the non-sweng crowds
The "second best language" is false, it's just usable for a wide variety of tasks. Just like many other languages, but specifically not Ruby.
At the time this was happening, neither Ruby nor JavaScript were credible alternatives, Python won by default. The only other scripting language I can remember being used in similar contexts at the time was Tcl.
The other half of it is that Python had a great C binding story, which made it easy to integrate Python with almost everything since C ABI was the lingua franca of interoperability. You could wrap high-performance code in a thin layer that would allow you to efficiently script it in Python. This is why Python became ubiquitous in HPC/supercomputing decades ago. It allowed Python to have a vast library of functionality with good performance and minimal effort by leaning on the vast number of C libraries that already existed to implement that functionality.
Another underappreciated advantage of Python, a bit like C++, is that it is promiscuous language. It isn’t so in love with its purity that it won’t happily do slightly dodgy things to hook up with things that were never intended to work together. It may not be pretty but you can actually make it work and the language doesn’t try to prevent it.
As time went on, a school of thought emerged that you could efficiently do almost anything you needed to do in software engineering with a combination of Python and C++. There is definitely an element of truth to that, and the ease with which you can combine those languages made them common bedfellows in practice.
Python initially won because it was better for writing complex software than Perl, but its staying power was based on its easy integration with literally everything.
This is a really excellent point. We were a Ruby/Golang shop and out of nowhere one of the sales engineers starts writing scripts in Perl. He wrote some really useful stuff, but the downside was that it was incomprehensible gobbledygook baked into 4 or 5 lines, and completely unmaintainable by anyone other than the original author. After he left we decided to rewrite all his code in Python/Ruby based on the general idea of what we wanted the inputs and outputs to be, rather than assign anyone to maintain the actual Perl code.
Vs the 8-year-old analytics/data science codebase I had to unwind a couple years back, which was written entirely in Python, was easily understandable from the first line, and had been maintained by no fewer than six people over that time.
And for Python there is the 13th aphorism of the Zen of Python: "There should be one-- and preferably only one --obvious way to do it."
The language is built around pushing you to do things a certain way, it doesn't force you, but the correct way is usually the easiest and cleanest way.
Perl would probably still have a place at the table if Perl 6 hadn’t turned into a debacle over an astonishing number of years.
I would argue that Python dominates Ruby in this metric.
New users wonder how to call functions. They form an intuition ("use parentheses"), but it's unreliable: "Oh, parentheses are optional--oh, parentheses are only optional sometimes".
New users wonder what a function is exactly. They form an intuition, but their initial intuition doesn't encompass all 7 different types of callables in Ruby, so many surprises and frustrations await.
Python is much more boring in this respect, users are more likely to form accurate intuitions.
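The contrast in miniature: in Python a bare name is always a reference and parentheses always mean a call, so the intuition formed on day one holds everywhere:

```python
def greet():
    return "hi"

f = greet              # no parentheses: a reference, nothing runs
print(callable(f))     # True
print(f())             # hi (parentheses always mean "call")
```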
What follows is my subjective experience as I came to like Python but hate Ruby: I learned Python in the mid 2000's while trying to script Asterisk. Asterisk had an existing community around PHP and Ruby, so I looked at Python, PHP, and Ruby with fresh eyes, Python appealed to me most. I remember being very confused about what Ruby "gems" were, I thought they were something like "JavaEE" that I'd heard about in passing at school, some kind of compiler plugins or something complicated. Python had "libraries" though, I knew what they were. I didn't seek out learning materials for the languages, I only saw how people were talking about them in the Asterisk community. I'm sure the right material would have explained what a "gem" was very well, but those obscure mailing lists I was reading did not. I never did give Ruby a fair shake, so don't take this as advice, these were merely my experiences as a new programmer looking at both languages with fresh eyes.
I can say that personally using Ruby, Python felt like a massive step backward. I can see how others might disagree, but I feel like it's all in what you know first. On a certain level, most of us know JavaScript is just terrible, but there have been millions of new devs who knew nothing else and thought it's fine--better than fine, its great, it's the other stuff that's weird! But then you go on for a while, maybe eventually find lisp and/or functional programming, and you realize how brain-damaged our most commonly used tools really are.
And you'd be wrong.
I can say that with confidence, since I fully expected Python to win over Ruby, and it did. Every time I heard hip devs raving about Ruby, I knew they were implicitly discounting the cognitive load that goes with learning Ruby.
The path from pseudo code to working Python code was (is?) straightforward, and Ruby doesn't bring anything in terms of paradigm over Python that justifies forgoing that advantage.
So every time a bright mind wanted to implement a library in her field of expertise, Python was her tool of choice. And that's how Python conquered field after field.
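A toy illustration of that claim (my own example, not from the thread): textbook pseudo code for a binary search transliterates into Python almost line for line.

```python
def binary_search(xs, target):
    """Return the index of target in the sorted list xs, or -1 if absent."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1   # search the upper half
        else:
            hi = mid - 1   # search the lower half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # -> 3
```

Strip the `def` and the colons and what remains reads like the pseudo code in an algorithms text.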
I'd agree with that. I've done a lot of work in Python and Ruby. I mostly do Python these days. I think Ruby's ergonomics were better, especially for novices. (E.g., if you want to know what methods you can call on an object, in Ruby you can just ask the object, but in Python there's a whole zoo of global functions you're supposed to know about. [1]) But I think ergonomics just don't matter much when compared with more practical considerations like the availability of libraries or the number of developers available to hire.
[1] https://docs.python.org/3/library/functions.html
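To make the contrast concrete (my example, not the commenter's): in Python, introspection goes through global built-ins like dir(), type(), and repr() rather than through methods on the object itself, the way Ruby's obj.methods works.

```python
class Greeter:
    def hello(self):
        return "hi"

g = Greeter()

# Ruby would let you ask the object directly (g.methods);
# in Python you reach for global built-ins instead:
print([m for m in dir(g) if not m.startswith("_")])  # -> ['hello']
print(type(g).__name__)                              # -> Greeter
print(repr(g.hello()))                               # -> 'hi'
```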
By 2005 or so Python was already the #1 scripting language (other than PHP, I guess).
I feel the Ruby community is very syntax oriented. As evidence of this:
I see Ruby developers interested in Elixir and Crystal, languages that are syntactically similar, but technically very different.
I do not see Ruby developers interested in Python, even though, if we ignore syntax, Python and Ruby might as well be the same language. Technically speaking, in the grand scope of all languages, Ruby and Python are very, very similar.
The same thing is happening with Python. And to be honest, if I came from a different profession and wanted to change my career, I would pick either Python or JavaScript for my first step.
print f"are you {sure} you need parenthesis to call a function in python"
>Python is much more boring in this respect, users are more likely to form accurate intuitions.
Is defining a class the same as defining a function? What about the functions implicitly defined when you define a class?
__what__ is __with__ __some_words__?
Why do you need an empty file called __init__.py in the same directory as your actual code?
This example doesn't call any functions. The print statement was removed in Python 3 and turned into a function, so you do need parentheses to call it and the example above is a syntax error. Python 2 (which had a print statement instead of a print function) didn't support f-strings. And f-strings, unlike JavaScript's template strings, are not function calls.
In other words, yes, you consistently use parentheses to call functions in Python.
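The rule really is uniform (a quick sketch of my own): a bare name refers to the function object, and parentheses are always what perform the call.

```python
def shout(word):
    return word.upper() + "!"

f = shout          # no parentheses: just a reference to the function object
result = f("hey")  # parentheses: the actual call
print(result)      # -> HEY!
print(callable(f)) # -> True

# The same uniformity makes higher-order code unsurprising:
print(list(map(shout, ["a", "b"])))  # -> ['A!', 'B!']
```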
print as a statement is inconsistent and surprising, but the explanation is shallow--they made an exception for print, that's all there is to it. It's not beautiful but it is unlikely to cause a 2 hour debugging session.
To begin with, I could never understand why there needed to be those global functions in Python to do meaningful things with lists.
In Ruby the "everything is an object" really works down to every nut and bolt and is clean and very conceptually pleasing.
Whether you call a package a gem, module, jar or whatever isn't very central.
The optional parentheses example is a good one. That mistake was carried over to Crystal, it's such a silly little inconsistency that I'm sure the designers considered "cute" or "clever". It's naught but a constant pain in the butt.
Another good example is YAML vs JSON. I understood the entirety of JSON syntax pretty much immediately upon seeing it, it's trivial. YAML is supposed to be "easier for humans", but by forgoing a uniform and consistent syntax it ended up an order of magnitude harder.
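The classic illustration of that YAML cost (my example) is the "Norway problem": in YAML 1.1, the bare scalar NO parses as the boolean false, while JSON's tiny grammar leaves no room for that kind of surprise.

```python
import json

# JSON's grammar is small and explicit: strings are quoted, booleans are
# spelled exactly true/false, so nothing is silently reinterpreted.
data = json.loads('{"country": "NO", "flag": true}')
print(data)  # -> {'country': 'NO', 'flag': True}

# The equivalent YAML, `country: NO`, would come back as
# {'country': False} under a YAML 1.1 parser.
```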
My memory is that, at least outside of the web/RoR world, Ruby was at no point as popular in the US as Python. Let's take, say, 2005 as a point when Ruby was really gaining popularity due to Rails: at that point I was already seeing internal tools that previously would have been written in Perl being written in Python.
Python just seems to have maintained a certain momentum, and I'm not sure why, but if I had to guess:
1. The syntax is almost a dead ringer for pseudo code. The first time I encountered Python, I fixed a bug in an open-source project having never seen Python before. I recall two different CS programs switching to Python from Scheme for exactly this reason.
2. It was designed to glue C together. Other languages were too (TCL and Lua; Perl, by contrast, primarily interfaced with the outside world the same way the shell does, with pipes).
3. A kitchen-sink approach to the standard library helped it a lot for the first 20 years of its existence. CPAN was sui generis at the time for installing libraries, so the more you could do without relying on third-party libraries, the better off you were.
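A sketch of what that kitchen-sink approach meant in practice (my example, standard library only): JSON parsing and an embedded SQL database ship with the interpreter, with nothing to install.

```python
import json
import sqlite3

# Parse JSON and persist it, using nothing beyond the Python distribution itself.
record = json.loads('{"name": "asterisk-bot", "lang": "python"}')

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tools (name TEXT, lang TEXT)")
db.execute("INSERT INTO tools VALUES (?, ?)", (record["name"], record["lang"]))

row = db.execute("SELECT lang FROM tools WHERE name = ?",
                 ("asterisk-bot",)).fetchone()
print(row[0])  # -> python
```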
You could kind of write Java in Python, and thus a certain kind of enterprise techie couldn’t dismiss Python as “just” a scripting language.
Every Perl code base I've seen, or inherited, has been a mess. My favourite was a bunch of utilities all written by the same person, who never did anything the same way twice. Each utility was like an entry in an obfuscated C contest.
I used Perl to build my first web application a LONG time ago. That project is included in the list of Perl code bases I've seen that were a mess :-)
Perl, like Visual Basic, had a well-deserved reputation for facilitating atrocious, hard-to-debug code, and it was this reputation, more than the lack of OO/XML support, that led both of them to fade back into the realm of forgotten memory and business-critical enterprise applications that nobody dares to touch.
It did take a while for the wider Perl community to embrace OO, though. There were a lot of Perl programmers who weren't into lofty paradigms. Plenty of Perl programmers weren't fully sold on the idea of breaking your program into functions.
There were so many "it can't possibly be that easy, but it is!" moments.
Let's write a function to add five to a number:
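Sketching it in Python (the thread's original snippet, reconstructed from the surrounding description):

```python
def add_five(n):
    return n + 5

print(add_five(10))  # -> 15
```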
OK. So, can I pass that function as an argument to another function? What? That worked?! And there was zero additional syntax, you just... do it? After a seemingly endless list of happy discoveries like that, I seriously rethought my idea of what programming languages could be, and ought to be.
I loved Perl, but it had a culture of doing things in N different ways and too many shortcuts, and you ended up with incredibly tiny scripts which were unreadable -- even to the author -- a month later. This wasn't practical for business use in the face of more consistent code from Python.
This is the big one. You look at it, and you think you know what it's going to do. The indentation rules also means things look predictable rather than being a mess of brackets.
I wanted a language that I would come back to and still understand what I was trying to do. Ruby looked like its spirit animal was Perl.
And then lots of things were built on top of Numpy - image processing, reading GIS raster data, scipy, pandas, etc etc - and they're all trivial to combine because it's all just Numpy arrays.
Python also had a very friendly and approachable community from the start, and great docs, while Ruby had its documentation in Japanese only for some time.
Django is also top quality, imo. Similar to Rails. But it always seemed Ruby was only good for Rails, whereas Python combines with everything.
So as usual, it's not about the language, it's about the rest.
However, the significance of both declined with the expansion of high-tech options (Spring, Go, Scala, Node.js) and the expansion of low-tech options (Firebase).
So today the popularity of each language's backend framework is less important.
Maybe it used to, but it seems a lot more people at least search for Django pretty much everywhere except the US, Canada and Japan.
https://trends.google.com/trends/explore?date=all&q=%2Fm%2F0...