Readit News
agrnet commented on How AI hears accents: An audible visualization of accent clusters   accent-explorer.boldvoice... · Posted by u/ilyausorov
bikeshaving · 2 months ago
The source code for this is unminified and very readable if you’re one of the rare few who has interesting latent spaces to visualize.

https://accent-explorer.boldvoice.com/script.js?v=5

agrnet · 2 months ago
could you explain what it means for someone to “have interesting latent spaces”? curious how you’re using that metaphor here
agrnet commented on QA-use-MCP: MCP for E2E testing   npmjs.com/package/@desple... · Posted by u/tarasyarema
tarasyarema · 2 months ago
As I mentioned above, Playwright alone won’t make the cut for many of the serious test cases we’ve seen; you need a whole system that ensures your tests are run and improved immediately. We created this project in a way that supports on-premise deployments, but you’ll need to run the whole engine and eventually use some SLMs/LLMs at different stages.
agrnet · 2 months ago
At the end of the day, isn’t the LLM just calling Playwright APIs? I’d rather have access to the final set of Playwright API steps the LLM executed to accomplish a goal than just hope the LLM chooses the same actions again the second time I run it.
agrnet commented on QA-use-MCP: MCP for E2E testing   npmjs.com/package/@desple... · Posted by u/tarasyarema
agrnet · 2 months ago
At least in my industry (highly regulated), I think it would be better if these agentic E2E tools output Playwright code instead of keeping it all under the hood, as no risk-averse regulated company will use a QA agent that could be nondeterministic when re-running the same test.
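The determinism point above can be sketched in a few lines of Python: have the agent work out a flow once, freeze the concrete step list it chose, and replay that fixed list on every subsequent run with no LLM in the loop. The `Step` fields, the example login flow, and the dispatcher below are hypothetical stand-ins for real Playwright calls (`page.goto`, `page.fill`, `page.click`), not any actual API of the tool being discussed.

```python
# Sketch: record the concrete steps an agent took once, then replay that
# fixed list deterministically instead of re-deriving actions every run.
from dataclasses import dataclass


@dataclass(frozen=True)
class Step:
    action: str      # e.g. "goto", "click", "fill" (mirrors Playwright verbs)
    target: str      # selector or URL
    value: str = ""  # text payload for "fill" steps


def record_run(agent_steps):
    """Freeze the agent's chosen actions into an immutable, replayable script."""
    return tuple(agent_steps)


def replay(script, dispatch):
    """Re-execute the recorded steps in order; no LLM involved at replay time."""
    return [dispatch(step) for step in script]


# Hypothetical login flow the agent worked out once.
script = record_run([
    Step("goto", "https://example.test/login"),
    Step("fill", "#user", "alice"),
    Step("click", "#submit"),
])

# Stub dispatcher standing in for real Playwright calls; a real runner
# would map each action to page.goto()/page.fill()/page.click().
log = replay(script, lambda s: f"{s.action} {s.target}")
```

Because `script` is a frozen tuple, two replays execute byte-identical steps, which is the auditability a regulated QA team would want.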
agrnet commented on There is a huge pool of exceptional junior engineers   workweave.dev/blog/hiring... · Posted by u/mooreds
carabiner · 3 months ago
Ok?
agrnet · 3 months ago
Ok!
agrnet commented on What is HDR, anyway?   lux.camera/what-is-hdr/... · Posted by u/_kush
Terr_ · 7 months ago
> Our eyes can see both just fine.

This gets to a gaming rant of mine: Our natural vision can handle these things because our eyes scan sections of the scene with constant adjustment (light-level, focus) while our brain is compositing it together into what feels like a single moment.

However, certain effects in games (i.e. "HDR" and Depth of Field) instead reduce the fidelity of the experience. These features limp along only while our gaze is aimed at the exact spot the software expects. If you glance anywhere else around the scene, you instead perceive an unrealistically wrong coloration or blur that frustratingly persists no matter how much you squint. These problems will remain until gaze-tracking support becomes standard.

So ultimately these features reduce the realism of the experience. They make it less like being there and more like you're watching a second-hand movie recorded on flawed video-cameras. This distinction is even clearer if you consider cases where "film grain" is added.

agrnet · 7 months ago
This is why I always turn these settings off immediately when I start any video game for the first time. I could never put my finger on why I didn’t like them, but the camera analogy is perfect.

u/agrnet

Karma: 7 · Cake day: May 15, 2025