Readit News
Twey commented on Response to "Ruby Is Not a Serious Programming Language"   robbyonrails.com/articles... · Posted by u/robbyrussell
falcor84 · 15 days ago
COBOL is used for the longest continuously running systems, particularly in the financial industry, many of which have been in production since the 1960s - mainly on mainframes, but nowadays also increasingly in cloud environments. There's nothing that comes close to the level of reliability that these offer, with the next closest probably being Fortran, Ada and Erlang.

I found this article with some numbers [0], with the top one being that "95% of ATM swipes rely on COBOL code". If you just need to maintain something in production, and only occasionally update the business logic, without having to upgrade the architecture, COBOL is the way to go.

[0] https://www.pragmaticcoders.com/resources/legacy-code-stats

Twey · 14 days ago
I think the correlation here is pretty solid but I wonder about the causality. There are a few big confounding variables; off the top of my head,

1. COBOL systems are typically written on a much shallower software stack, with less room for unreliability.

2. Banking systems have had a ton of effort put into reliability over decades of development.

Twey commented on Freer Monads, More Extensible Effects (2015) [pdf]   okmij.org/ftp/Haskell/ext... · Posted by u/todsacerdoti
thesz · 25 days ago
What is a language feature in some language is a library in Haskell.
Twey · 25 days ago
Arguably an effect monad is an EDSL that has algebraic effects :)

But the things these languages are experimenting with are low-level implementation details that wouldn't be amenable to embedding. There's no escaping the Haskell GC.
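
To unpack the first point a bit, here is a minimal hand-rolled sketch of the EDSL-plus-handlers idea (not the open-union machinery from the paper; the names Freer, Teletype and putLine are invented for illustration): a program is a tree of requests, and handlers decide what each request means.

    {-# LANGUAGE GADTs #-}
    -- A hand-rolled 'freer'-style monad: a program is a tree of requests,
    -- and handlers decide what each request means.
    data Freer f a where
      Pure :: a -> Freer f a
      Bind :: f x -> (x -> Freer f a) -> Freer f a

    instance Functor (Freer f) where
      fmap f (Pure a)   = Pure (f a)
      fmap f (Bind e k) = Bind e (fmap f . k)

    instance Applicative (Freer f) where
      pure = Pure
      Pure f   <*> x = fmap f x
      Bind e k <*> x = Bind e (\y -> k y <*> x)

    instance Monad (Freer f) where
      Pure a   >>= k = k a
      Bind e g >>= k = Bind e (\x -> g x >>= k)

    -- One effect signature for the EDSL: console output.
    data Teletype x where
      PutLine :: String -> Teletype ()

    putLine :: String -> Freer Teletype ()
    putLine s = Bind (PutLine s) Pure

    -- Two handlers for the same program: run it in IO, or collect the
    -- output purely.
    runIOTeletype :: Freer Teletype a -> IO a
    runIOTeletype (Pure a)             = pure a
    runIOTeletype (Bind (PutLine s) k) = putStrLn s >> runIOTeletype (k ())

    runPureTeletype :: Freer Teletype a -> (a, [String])
    runPureTeletype (Pure a)             = (a, [])
    runPureTeletype (Bind (PutLine s) k) =
      let (a, out) = runPureTeletype (k ()) in (a, s : out)

    main :: IO ()
    main = do
      let prog = putLine "hello" >> putLine "world"
      runIOTeletype prog               -- actually prints the lines
      print (runPureTeletype prog)     -- ((), ["hello","world"])

The same program gets two different interpretations, which is the EDSL-with-algebraic-effects reading of an effect monad.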

Twey commented on Freer Monads, More Extensible Effects (2015) [pdf]   okmij.org/ftp/Haskell/ext... · Posted by u/todsacerdoti
throwthrow0987 · 25 days ago
Effect systems are a trend that will go away. You can't statically guarantee that only, for example, the DB has side effects in a function. So what's the point? Haskell got it right in the first instance: IO or pure.
Twey · 25 days ago
The point of effect systems isn't to stratify the behaviour of the operating system (it's the Wild West out there). It's to stratify the behaviour of your program. A function that has a DB effect isn't telling you that it will make Postgres queries (which can do anything!), it's telling you that it wants to make DB queries so you need to pass it a handler of a certain form to let it do that, hexagonal architecture style.
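
Here's a minimal sketch of that idea, assuming nothing beyond base and containers (DbHandler, lookupUser and greeting are names invented for this comment, not from any particular effects library): the function declares which operations it wants, and the caller supplies a handler of the right shape.

    import Data.Functor.Identity (Identity, runIdentity)
    import qualified Data.Map as Map

    -- A 'handler' here is just a record of the operations the function is
    -- allowed to perform.
    data DbHandler m = DbHandler
      { lookupUser :: Int -> m (Maybe String)
      }

    -- Business logic: says that it wants to make DB queries, but nothing
    -- about how they are answered.
    greeting :: Monad m => DbHandler m -> Int -> m String
    greeting db uid = do
      mu <- lookupUser db uid
      pure (maybe "Hello, stranger!" ("Hello, " ++) mu)

    -- One possible handler: a pure, in-memory table, handy for tests.
    fakeDb :: Map.Map Int String -> DbHandler Identity
    fakeDb table = DbHandler { lookupUser = \uid -> pure (Map.lookup uid table) }

    main :: IO ()
    main = putStrLn (runIdentity (greeting (fakeDb (Map.fromList [(1, "Ada")])) 1))

Swapping fakeDb for a handler that talks to a real database changes nothing in greeting, which is the whole point.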

But you can also stratify in the other direction. ‘Pure’ functions aren't real outside of mathematics: every function can have side effects like allocating memory, possibly not terminating, internal (‘benevolent’) mutation, et cetera. When we talk about ‘pure’ functions we usually mean that they have only a particular set of effects that the language designer considered ‘safe enough’, where ‘enough’ is usually defined with reference to the ergonomic impact of making that effect explicit. Algebraic effects make effects (and importantly effect composition — we got here in the first place because we were fed up of monad transformers) more ergonomic to use, which means you can make more effects explicit without annoying your users.

Twey commented on Freer Monads, More Extensible Effects (2015) [pdf]   okmij.org/ftp/Haskell/ext... · Posted by u/todsacerdoti
kccqzy · a month ago
If you are looking for real-world code for an effect system, not just a PDF paper, you should probably look at the eff library: https://github.com/hasura/eff

The acknowledgement section on that GitHub README mentions this paper.

Twey · 25 days ago
As far as I know the shiniest implementations in the effect typing world at the moment are Koka and Effekt, which are both languages in their own right. They each have their own ideas about implementation to make effects (mostly) zero-cost.

https://koka-lang.github.io/

https://effekt-lang.org/

Frank is pretty old now but perhaps a simpler implementation: https://github.com/frank-lang/frank

Twey commented on Think in math, write in code (2019)   jmeiners.com/think-in-mat... · Posted by u/alabhyajindal
Twey · a month ago
This article comes across as rather defeatist:

> Another limitation of programming languages is that they are poor abstraction tools

> Programming languages are implementation tools for instructing machines, not thinking tools for expressing ideas

Machine code is an implementation tool for instructing machines (and even then there's a discussion to be had about designing machines with instruction sets that map more neatly to the problems we want to solve with them). Everything we've built on top of that, from assembly on up, is an attempt to bridge the gap from ‘thinking tools for expressing ideas’.

The holy grail of programming languages is a language that seamlessly supports expressing algorithms at any level of abstraction, including or omitting lower-level details as necessary. Are we there yet? Definitely not. But to give up on the entire problem and declare that programming languages are inherently unsuitable for idea expression is really throwing the baby out with the bathwater.

As others in the comments have noted, it's a popular and successful approach to programming today to just start writing code and see where the nice structure emerges. The feasibility of that approach is entirely thanks to the increasing ability of programming languages to support top-down programming. If you look at programming practice in the past, when the available implementation languages were much lower-level, software engineering was dominated by high-level algorithm design tools like flowcharts, DRAKON, Nassi–Shneiderman diagrams, or UML, which were then painstakingly compiled by hand (in what was considered purely menial work, especially in the earlier days) into computer instructions. Our modern programming languages, even the ‘low-level’ ones, are already capable of higher levels of abstraction than the ‘high-level’ algorithm design tools of the '50s.

Twey commented on Syntax and Semantics of Programming Languages (1995)   homepage.cs.uiowa.edu/~sl... · Posted by u/nill0
anonymousDan · a month ago
I've always really struggled to understand the purpose of defining the 'semantics' of a programming language and how it differs from syntax. Explanations that involve 'giving a precise mathematical meaning' just seem almost circular to me. As I understand it now it's about saying what the value of a particular language construct should be (e.g. when evaluated), as opposed to whether the construct is allowed/part of the language (syntax). Is that intuition wrong?
Twey · a month ago
In addition to the other comments here, note that in PL circles ‘syntax’ typically denotes _everything_ that happens before translation/execution, importantly including type checking. ‘Semantics’ is then about explaining what happens when the program is run, which can equivalently be described as deciding when two programs are equal, mapping programs to mathematical objects (whose ‘meaning’, or at least equality, is considered to be well understood), specifying a set of transformations the syntax goes through, et cetera.

In pure functional languages saying what value an expression will evaluate to (equivalently, explaining the program as a function of its inputs) is a sufficient explanation of the meaning of a program, and semantics for these languages is roughly considered to be ‘solved’. Open areas of study in semantics tend to be more about doing the same thing for languages that have more complicated effects when run, like imperative state update or non-local control (exceptions, async, concurrency).
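
As a toy illustration of 'mapping programs to mathematical objects' (the Expr type and denote function below are made up for this comment, not taken from the linked notes): two syntactically different programs are semantically equal exactly when they denote the same value.

    -- A toy 'semantics': map each program of a tiny expression language to
    -- a mathematical object (here just an integer).
    data Expr
      = Lit Int
      | Add Expr Expr
      | Mul Expr Expr
      deriving Show

    denote :: Expr -> Int
    denote (Lit n)   = n
    denote (Add a b) = denote a + denote b
    denote (Mul a b) = denote a * denote b

    main :: IO ()
    main = do
      let p1 = Mul (Lit 2) (Add (Lit 1) (Lit 3))  -- 2 * (1 + 3)
          p2 = Add (Lit 4) (Lit 4)                -- 4 + 4
      -- Different syntax, same denotation: semantically equal programs.
      print (denote p1, denote p2, denote p1 == denote p2)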

There's some overlap in study: typically syntax is trying to reflect semantics in some way, by proving that programs accepted by the syntactic analysis will behave or not behave a certain way when run. For example, Rust's borrow checker is a syntactic check that the program under scrutiny will not dereference an invalid pointer, even though that's something that is possible according to Rust's runtime semantics. Compare Java, which has no syntactic check for this because dereferencing invalid pointers is simply impossible according to the semantics of the JVM.

Twey commented on Is Software the UFOlogy of Engineering Disciplines?   codemanship.wordpress.com... · Posted by u/flail
mamcx · a month ago
The problem is the deps: you can have Rust with no nulls, but the OS underneath wasn't made that way. You can have an RDBMS with ACID, but the file system is a liar.

Then you can make a product that is applied engineering, but you can't replace all the others that aren't.

Civil engineering and friends have the advantage of being built on top of the universe, which, despite claims to the contrary, was not hacked together with Perl!

Twey · a month ago
Engineers also have this problem: if the transistor doesn't meet its claimed tolerances then the robot won't either.
Twey commented on Is Software the UFOlogy of Engineering Disciplines?   codemanship.wordpress.com... · Posted by u/flail
Twey · a month ago
The ‘three tribes of programming’ [1] strike again!

This thread is full of claims that ‘programming is really engineering’ (in accordance with the article), ‘programming is really building’, or ‘programming is really philosophy/mathematics’. They're all true!

It's not that one of them is the True Nature of software and anyone doing the others is doing it wrong or at a lower level. These are three different things that it is completely reasonable to want to do with software, and each of them can be done to an amateur or expert level. But only one of them is amenable to scientific analysis. (The other two are amenable to market testing and formal proof, respectively.)

[1]: https://josephg.com/blog/3-tribes/

Twey · a month ago
On second thought, the tribal testing framework here is a bit simplistic, and there's some cross-tribe pollination, with varying levels of success.

The ‘maker’ tribe also tests with HCI assessments like GOMS and other methods from the ‘soft’ sciences like psychology and sociology, not just economics/business science.

Model-checking and complexity proofs (and complexity type systems) are mathematical attempts to apply mathematician-programmer methods to engineer-programmer properties.

Cyclomatic complexity is an attempt to measure mathematician-programmer properties using engineer-programmer methods.
