tonyg commented on Cap'n Web: a new RPC system for browsers and web servers   blog.cloudflare.com/capnw... · Posted by u/jgrahamc
kentonv · 3 months ago
The input to the map function (when it is called in "record" mode on the client) is an RpcPromise for the eventual value. That means you can't actually inspect the value, you can only queue pipelined calls on it. Since you can't inspect the value, you can't do any computation or branching on it. So any computation and branching you do perform must necessarily have the same result every time the function runs, and so can simply be recorded and replayed.

The only catch is your function needs to have no side effects (other than calling RPC methods). There are a lot of systems out there that have similar restrictions.
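
A rough, self-contained TypeScript toy of that record-and-replay idea (a sketch only, not the actual Cap'n Web implementation; `getBio` and `translate` are invented method names):

  // The callback only ever sees a proxy standing in for the eventual value,
  // so the only thing it can do is make further calls, which we record.
  type Op = { method: string; args: unknown[] };

  // Build a proxy that records every method call made on it.
  function makeRecorder(ops: Op[]): any {
    return new Proxy({}, {
      get(_target, prop) {
        return (...args: unknown[]) => {
          ops.push({ method: String(prop), args });
          return makeRecorder(ops);   // results can be pipelined further
        };
      },
    });
  }

  // "Record mode": run the callback once against the recorder.
  function record(fn: (x: any) => any): Op[] {
    const ops: Op[] = [];
    fn(makeRecorder(ops));
    return ops;
  }

  const ops = record(author => author.getBio().translate("en"));
  console.log(ops);
  // [ { method: "getBio", args: [] }, { method: "translate", args: ["en"] } ]

Because the callback cannot branch on the value, this single recording is valid for every element, which is what lets the real system replay it on the server side.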

tonyg · 3 months ago
Is .map special-cased, or do user functions accepting callbacks work the same way? If so, you could do the Scott-Mogensen thing of #ifTrue:ifFalse:, dualizing the control-flow decision-making and offering a menu of choices/continuations.
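
For concreteness, a minimal TypeScript sketch (nothing Cap'n-Web-specific) of what that dualization looks like: a Scott-style boolean is consumed by handing it the menu of continuations, so whoever holds the value makes the branch decision, not the caller.

  // A Scott/Church-style boolean: a function that picks one of two continuations.
  type SBool = <R>(ifTrue: () => R, ifFalse: () => R) => R;

  const strue: SBool = (ifTrue, _ifFalse) => ifTrue();
  const sfalse: SBool = (_ifTrue, ifFalse) => ifFalse();

  // The caller never inspects a boolean; it only offers both branches.
  function describe(b: SBool): string {
    return b(() => "it was true", () => "it was false");
  }

  console.log(describe(strue));  // "it was true"
  console.log(describe(sfalse)); // "it was false"

In the RPC setting, each continuation would itself be a recorded callback shipped to the server, which picks one based on the value it holds.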

tonyg commented on Synthesizing Object-Oriented and Functional Design to Promote Re-Use   cs.brown.edu/~sk/Publicat... · Posted by u/andsoitis
tonyg · 3 months ago
(1998). Java existed, but neither Scala nor Java-with-generics did.

From the conclusion:

"We have presented a programming protocol, Extensible Visitor, that can be used to construct systems with extensible recursive data domains and toolkits. It is a novel combination of the functional and object-oriented programming styles that draws on the strengths of each. The object-oriented style is essential to achieve extensibility along the data dimension, yet tools are organized in a functional fashion, enabling extensibility in the functional dimension. Systems based on the Extensible Visitor can be extended without modification to existing code or recompilation (which is an increasingly important concern)."

tonyg commented on Web apps in a single, portable, self-updating, vanilla HTML file   hyperclay.com/... · Posted by u/pil0u
paulirish · 4 months ago
Turns out, the original TiddlyWiki used a Java JAR to handle the file persistence. (I remember it being so magically automatic, but I recently investigated how it was done.)
tonyg · 4 months ago
I don't think that's right - IIRC it used to be possible to write out a file, if loaded from a file:// URL, directly from JavaScript. Then that ability got nobbled for security reasons (justifiable) without properly thinking through a good alternative (not justifiable). I mourn the loss of that ability; TiddlyWiki was in a class of its own, and there should have been many more systems inspired by its design. Alas.

ETA: Wikipedia has reminded me the feature was called UniversalXPConnect; it was a Firefox thing and wasn't cross-browser. It still sucks that it was removed without a sensible replacement.

tonyg commented on Please Commit More Blatant Academic Fraud (2021)   jacobbuckman.com/2021-05-... · Posted by u/jxmorris12
tonyg · 10 months ago
> Undermining the credibility of computer science research is the best possible outcome for the field, since the institution in its current form does not deserve the credibility that it has.

Horseshit. This might be true for AI research (and even there that's an awfully broad brush you're using, mate), but it's certainly not true for other areas of computer science.

tonyg commented on Intensional Joy (a concatenative account of internal structure)   pithlessly.github.io/inte... · Posted by u/g0xA52A2A
kazinator · 10 months ago
Static analysis can tell what forms are invoking an fexpr and which are function calls. It's no different from knowing which are macros. That problem can be solved.

The main problem is that a language with fexprs is inherently not compilable. A second problem is that for some fexprs, compilation semantics cannot be found.

A fexpr definition adds a new special operator to an interpreter. The existence of fexprs means that the repertoire of special operators is open-ended. But a compiler depends on there being a fixed set of special operators known in advance. For each kind of form the compiler has a case, which implements the translation scheme. Someone wrote that translation scheme by understanding what that special form does.

When a compiler hits a form that is a fexpr invocation, there is no translation scheme for it. It is defined by a piece of code in the program itself, which gives the interpretive semantics for the form. The compiler would have to read that code, understand it, and come up with a translation scheme for it from interpreted to compiled semantics. In other words, it would have to do the job of a compiler writer. That requires advanced artificial intelligence.

Some fexprs are not compilable. A compiler writing expert can look at the application code which defines the fexpr, and come to the conclusion that the code doesn't make sense outside of the interpreted world.

Fexprs that can be compiled correspond to those forms which can be written as macros.

A strategy is possible whereby for each fexpr, the application must supply a macro definition, if that application is to be compilable. The interpreter will use the fexpr, and the compiler will instead expand the macro and use that.

But what is the point? You have to maintain two implementations of the same thing. Interpreters can use macros just fine.

Fexprs do have an advantage over macros: lack of hygiene issues. The local variables in a fexpr are clearly in a different lexical environment from the variables of the form that it operates on/with. The fexpr function has the lexical environment of the fexpr form as an argument. When the interpreter invokes a fexpr, it hands the fexpr the current lexical environment and the fexpr form. Whenever the fexpr code needs to evaluate some part of that form, like a variable reference, it explicitly calls eval and passes eval that lexical environment. In no way does that get mixed up with the interpretation of the fexpr itself. There can be no capture issue. An fexpr would never need gensyms, or contain a mistake because you forgot to use them.

Macros also don't have hygiene issues between their own variables and those in the generated code. But fexprs can use their own variables as runtime temporaries to hold intermediate values needed by the calculation that they are interpreting. Macros cannot use their own variables this way because they are not executing at run time; they have to introduce variables into the generated code. Those variables are then in the same lexical environment as that code and must be given unique symbols in order to hide them and protect them from conflicts.

Some newcomers to the Lisp world still become fascinated by fexprs for, I suspect, mainly this reason. The lack of hygiene concerns somehow gives fexprs a kind of dignified air, so to speak, like they are clean and fundamental. This view is further bolstered by the fact that any macro could be a fexpr, but the converse isn't true. There's a kind of magic in allowing interpreted code to dynamically extend the interpreter in arbitrary ways, seemingly bounded only by the limits of computation.
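
To make the environment-passing concrete, here is a self-contained TypeScript toy (a sketch of the mechanism, not Kernel or any real Lisp): an fexpr receives the unevaluated operand forms plus the caller's environment, and its own locals are ordinary host-language variables, so there is nothing for caller code to capture.

  type Form = string | number | Form[];          // symbols, literals, lists
  type Env = Map<string, Value>;
  type Value =
    | number | boolean
    | { kind: "applicative"; fn: (args: Value[]) => Value }
    | { kind: "operative"; fn: (forms: Form[], callerEnv: Env) => Value };

  function evalForm(form: Form, env: Env): Value {
    if (typeof form === "number") return form;
    if (typeof form === "string") return env.get(form)!;
    const op = evalForm(form[0], env);
    const operands = form.slice(1);
    if (typeof op === "object" && op.kind === "operative") {
      return op.fn(operands, env);                        // fexpr: forms passed unevaluated
    }
    if (typeof op === "object" && op.kind === "applicative") {
      return op.fn(operands.map(o => evalForm(o, env)));  // ordinary call: arguments evaluated
    }
    throw new Error("not callable");
  }

  // An "$or"-like fexpr: evaluates its second operand only if needed. Its
  // local `first` is a host variable, invisible to interpreted code, so no
  // gensym is needed even though the caller also has a variable named "first".
  const $or: Value = {
    kind: "operative",
    fn: (forms, callerEnv) => {
      const first = evalForm(forms[0], callerEnv);
      return first === false ? evalForm(forms[1], callerEnv) : first;
    },
  };

  const env: Env = new Map<string, Value>([
    ["$or", $or],
    ["first", 42],   // caller variable that happens to share the name
    ["f", false],
  ]);
  console.log(evalForm(["$or", "f", "first"], env)); // 42

And because `$or` here is an arbitrary host function, a compiler meeting the form ["$or", "f", "first"] has no translation scheme to apply short of understanding `$or`'s body, which is the compilability problem described above.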

tonyg · 10 months ago
> Static analysis can tell what forms are invoking an fexpr and which are function calls. It's no different from knowing which are macros. That problem can be solved.

I don't think this is the case. Consider Kernel's

  ($lambda (f) (f (+ 3 4)))
Is `f` a fexpr or a closure? We cannot know until runtime.
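
A TypeScript analogue of the same point (with a thunk standing in for the unevaluated form): the two callees below are invoked from a call site of identical shape, but only the runtime tag of the value bound to `f` determines whether the argument expression gets evaluated.

  type Callee =
    | { kind: "fn"; call: (value: number) => number }           // closure: gets the value
    | { kind: "fexpr"; call: (expr: () => number) => number };  // fexpr-ish: gets the form

  // "(f (+ 3 4))": whether (+ 3 4) is evaluated here is decided only at runtime.
  function callWith(f: Callee): number {
    return f.kind === "fn" ? f.call(3 + 4) : f.call(() => 3 + 4);
  }

  console.log(callWith({ kind: "fn", call: v => v * 10 }));           // 70
  console.log(callWith({ kind: "fexpr", call: expr => expr() + 1 })); // 8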

tonyg commented on Preserves: An Expressive Data Language   preserves.dev/... · Posted by u/mpweiher
skybrian · a year ago
Thanks for the clarification! That sounds about as evolvable as JSON or any system that uses string keys (like HTTP headers).

Protobufs have an extra level of indirection built in: code refers to fields using names, but numbers are sent on the wire. Without convenient access to field numbers, they can’t as easily be hard-coded. This also strongly encourages using the schema file for most tasks. With protobufs (or similar), any user-friendly editor will need a schema to make sense of the data.

JSON-like systems and protobufs have opposite design goals: encouraging versus discouraging schemaless data access.
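
A toy TypeScript rendering of that indirection (not the real protobuf wire format, which is binary with varint tags; the schema and values here are made up):

  const personSchema = { name: 1, address: 2 } as const;   // the role of the .proto file

  type Person = { name: string; address: string };

  function encode(p: Person): Record<number, string> {
    const wire: Record<number, string> = {};
    for (const [field, tag] of Object.entries(personSchema)) {
      wire[tag] = p[field as keyof Person];                 // names never hit the wire
    }
    return wire;
  }

  console.log(encode({ name: "Alice", address: "somewhere" }));
  // { "1": "Alice", "2": "somewhere" }: unreadable without the schema, by design

Hard-coding a field number in application code is possible but awkward, which is the nudge toward schema-driven access being described.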

tonyg · a year ago
There are no string keys in the Person example above. You could add some, though, or use numbers instead with the same host-language API:

  Person = <person @name String @address Address>
as above, or

  Person = <person {
    @name "name": String
    @address "address": Address
  }>
or

  Person = {
    @name 1: String
    @address 2: Address
  }
etc. all produce the same host-language record, e.g. in TypeScript

  export type Person = {
    name: String,
    address: Address,
  };

u/tonyg

Karma: 3014 · Cake day: March 9, 2009
About
I'm Tony Garnock-Jones.

https://leastfixedpoint.com/

mastodon: @tonyg@leastfixedpoint.com
