Readit News
fitzn commented on Anthropic acquires Bun   bun.com/blog/bun-joins-an... · Posted by u/ryanvogel
kentonv · 16 days ago
I built Cloudflare Workers?
fitzn · 16 days ago
Boom
fitzn commented on OpenZL: An open source format-aware compression framework   engineering.fb.com/2025/1... · Posted by u/terrelln
fitzn · 2 months ago
Non-Linear Compression! We had a tiny idea back in the day in this space but never got too far with it (https://www.usenix.org/conference/hotstorage12/workshop-prog...).

I am pumped to see this. Thanks for sharing.

fitzn commented on Cap'n Web: a new RPC system for browsers and web servers   blog.cloudflare.com/capnw... · Posted by u/jgrahamc
kentonv · 3 months ago
To chain three calls, the client will send three messages, yes. (At least when using the WebSocket transport. With the HTTP batch transport, the entire batch is concatenated into one HTTP request body.)

But the client can send all three messages back-to-back without waiting for any replies from the server. In terms of network communications, it's effectively the same as sending one message.
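To make the pipelining concrete, here is a minimal TypeScript sketch of three chained calls going out back-to-back over the WebSocket transport. The import and the method names (`authenticate`, `getProfile`, `getName`) are illustrative assumptions in the spirit of the blog post's examples, not the exact Cap'n Web API.

```ts
// Minimal sketch, assuming a Cap'n Web-style client; the methods below
// are invented for illustration.
import { newWebSocketRpcSession } from "capnweb";

const token = "example-token"; // placeholder credential

const api = newWebSocketRpcSession("wss://example.com/api");

// Each chained call returns a promise-like stub that can be called through
// before it resolves, so all three messages are written to the socket
// back-to-back without waiting for any server replies.
const session = api.authenticate(token); // message 1
const profile = session.getProfile();    // message 2
const name = await profile.getName();    // message 3, then the only await

// Only the final `await` blocks on the network: the server receives the
// whole chain, evaluates it, and sends back a single reply, so the client
// observes one round trip. (With the HTTP batch transport, the same three
// messages would instead be concatenated into one request body.)
console.log(name);
```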

fitzn · 3 months ago
Yep - agreed. Thanks!
fitzn commented on Cap'n Web: a new RPC system for browsers and web servers   blog.cloudflare.com/capnw... · Posted by u/jgrahamc
fitzn · 3 months ago
Just making sure I understand the "one round trip" point. If the client has chained 3 calls together, that still requires 3 messages sent from the client to the server. Correct?

That is, the client is not packaging up all its logic and sending a single blob that describes the fully-chained logic to the server on its initial request. Right?

When I first read it, I was thinking it meant 1 client message and 1 server response. But I think "one round trip" more or less means "1 server message in response to potentially many client messages". That's a fair use of "1 RTT", but it took me a moment to understand.

Just to make that distinction clear from a different angle, suppose the client were _really_ _really_ slow and it did not send the second promise message to the server until AFTER the server had computed the result for promise1. Would the server have already responded to the client with the result? That would be a way to incur multiple RTTs, albeit the application wouldn't care since it's bottlenecked by the client CPU, not the network in this case.

I realize this is unlikely. I'm just using it to elucidate the system-level guarantee for my understanding.

As always, thanks for sharing this, Kenton!

fitzn commented on From unit tests to whole universe tests (with will wilson of antithesis) [video]   youtube.com/watch?v=_xJ4m... · Posted by u/zdw
narsa123 · 3 months ago
Any tools we could use for mobile app automation testing using AI (like MCPs for mobile app testing)?
fitzn · 3 months ago
Reflect tests mobile apps by converting plain-text instructions into Appium commands at runtime using AI. Your tests are just the text steps.

https://reflect.run/mobile-testing/

disclaimer: I co-founded Reflect.
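Purely as a hypothetical illustration of the "plain-text steps → AI → Appium" shape described above (not Reflect's actual implementation), a runner might look something like this; `interpretStep` is an invented stand-in for the AI call, while the driver calls use the real WebdriverIO Appium client.

```ts
// Hypothetical sketch only: interpretStep() is invented; the WebdriverIO
// calls (remote, $, click, setValue, deleteSession) are real client APIs.
import { remote } from "webdriverio";

type Action = { selector: string; command: "click" | "setValue"; value?: string };

// Stand-in for an AI model that turns a natural-language step into a
// concrete Appium action (selector + command).
async function interpretStep(step: string): Promise<Action> {
  if (/login/i.test(step)) return { selector: "~Login", command: "click" };
  return { selector: "~Submit", command: "click" };
}

async function runTextSteps(steps: string[]) {
  // Connect to a local Appium server (default port 4723).
  const driver = await remote({
    hostname: "localhost",
    port: 4723,
    capabilities: { platformName: "iOS", "appium:automationName": "XCUITest" },
  });
  try {
    for (const step of steps) {
      const action = await interpretStep(step);
      const el = await driver.$(action.selector); // "~" = accessibility id
      if (action.command === "click") await el.click();
      else await el.setValue(action.value ?? "");
    }
  } finally {
    await driver.deleteSession();
  }
}

// In this framing, a test is just an array of text steps:
// runTextSteps(["Tap the Login button", "Submit the form"]);
```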

fitzn commented on Rocketships and Slingshots   postround.substack.com/p/... · Posted by u/juecd
fitzn · 3 months ago
It would have been cool to read stories about a few more examples of slingshots, as the author calls them.
fitzn commented on You don't want to hire "the best engineers"   otherbranch.com/shared/bl... · Posted by u/rachofsunshine
fitzn · 4 months ago
> The best engineers make more than your entire payroll. They have opinions on tech debt and timelines. They have remote jobs, if they want them. They don’t go “oh, well, this is your third company, so I guess I’ll defer to you on all product decisions”. They care about comp, a trait you consider disqualifying. They can care about work-life balance, because they’re not desperate enough to feel the need not to. And however successful your company has been so far, they have other options they like better.

Yep

u/fitzn

Karma: 240 · Cake day: March 31, 2020
About
VP, AI and Architecture at SmartBear (smartbear.com)

Co-Founder at Reflect S20 (https://reflect.run)
