Readit News
ClawsOnPaws commented on Building a Personal AI Factory   john-rush.com/posts/ai-20... · Posted by u/derek
IncreasePosts · 2 months ago
Okay, what is he actually building with this?

I have a problem where, half the time I see people talking about their AI workflow, I can't tell whether they're describing some dream workflow they aspire to, or something they're actually using productively

ClawsOnPaws · 2 months ago
I keep coming to the same conclusion, which basically is: if I had an LLM write it for me, I just don't care about it. Out of the maybe 50 or so projects I have, two are LLM generated, and even for those two I cared enough to make changes myself without an LLM. The rest just sit there because one day I thought, huh, wouldn't it be neat if, and then realized I actually cared more about having that thought than having the result of it.

Then you end up fighting with different models and implementation details, then it messes something up and you go back and forth about how you actually want it to work, and somehow this is so much more draining and exhausting than just getting the work done manually, with some slight completion help perhaps, maybe a little bit of boilerplate fill-in. And yes, this is after writing extensive design docs, having some reasoning LLM figure out the tasks that need to be completed, and having models talk back and forth about what needs to happen while it's happening. And then I spent a whole lot of money on what, exactly? Questionably working software that kinda sorta does what I wanted it to do?

If I have a clear idea, or an existing codebase, and I end up guiding it along, agents and such are pretty cool, I guess. But vibe coding? Maybe I'm in the minority here, but as soon as it's a non-trivial app, not just a random small script or bespoke one-off, it's not fun. I often don't get the results I actually wanted even when I tried to be as specific as I could with my prompting and design docs and example data and all that. It's expensive, the code is still messy as heck, and at the end I feel like I just spent a whole lot of time literally arguing with my computer. Why would I want to do that?
ClawsOnPaws commented on Airpass – Easily overcome WiFi time limits   airpass.tiagoalves.me/... · Posted by u/herbertl
ClawsOnPaws · 2 months ago
Since this is only available for Mac, couldn't this fairly easily be solved with Shortcuts?
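Whatever the automation layer, the underlying trick is the same. Assuming Airpass works by spoofing the Wi-Fi MAC address so the captive portal sees a "new" device and restarts its timer (an assumption; the thread doesn't confirm the mechanism), the core step is a small shell sketch. The interface name `en0` is also an assumption (it's typically the Wi-Fi interface on a Mac):

```shell
# Generate a random locally-administered, unicast MAC address.
# The 02 prefix marks it as locally administered, so it won't
# collide with any vendor-assigned hardware address.
new_mac=$(printf '02:%02x:%02x:%02x:%02x:%02x' \
  $((RANDOM % 256)) $((RANDOM % 256)) $((RANDOM % 256)) \
  $((RANDOM % 256)) $((RANDOM % 256)))
echo "$new_mac"

# Applying it on macOS requires root (sketch, not run here):
#   sudo ifconfig en0 ether "$new_mac"
```

After re-joining the network with the new address, a time-limited portal would treat the machine as a fresh device.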
ClawsOnPaws commented on Show HN: EchoStream – A Local AI Agent That Lives on Your iPhone    · Posted by u/shuhongwu
ClawsOnPaws · 2 months ago
Not available in my region, apparently.
ClawsOnPaws commented on Google battling 'fox infestation' on roof of £1B London office   theguardian.com/uk-news/2... · Posted by u/pseudolus
ClawsOnPaws · 3 months ago
If those foxes were to spontaneously combust, then Google would have a Firefox problem.
ClawsOnPaws commented on Ask HN: My son might be blind – how to best support    · Posted by u/tkuraku
jesterswilde · a year ago
Learning to understand the world around you via clicking isn't a natural or easy thing to do. I can't do it personally but have looked into it. For me the benefits didn't seem worth the time investment (plus I was older when I looked into it.)

Learning to click to understand what is around you is, IMO, a viable thing to look into for your kid and decide if you want to undertake that training. Daniel Kish is the name of the guy most famous for it and would be a decent place to start looking.

An amusing anecdote, and a bit of one blind person throwing shade at another: https://youtu.be/u-7w3m7fhl4?t=326

ClawsOnPaws · a year ago
Fellow blind person here, adding my own anecdote. I click and echolocate. I have two different kinds of clicks: a soft click for my very immediate surroundings, which I can do rapidly if I need to, and a loud click for sizing up large spaces, which I don't use very often, for relatively obvious reasons. They're quite helpful for me, and especially in new, unfamiliar spaces it's almost a reflex that happens on its own unless I consciously try to stop it for social reasons. Just to add another data point: what works for one might not work for another, so there's a lot of trial and error involved in figuring out what works and what doesn't. This can be very frustrating sometimes, but sympathy will go a long way.

Something I wanted to add; maybe this thread in particular isn't the best place for it, but in general: I'm very lucky that my parents did not prevent me from doing things that others might have. For example, I climbed trees, rode a bike, and generally tried to do all the things my sighted peers were doing. Naturally there were accidents, but not being prevented from doing those things, from learning my limits, my balance and physical control, from getting hurt and getting back up, was, I believe, absolutely vital to making me the person I am today.

I imagine as a parent this can be very stressful or worrying, but I honestly do not believe I would be as independent now if I hadn't been allowed to do those things back then. So unless it is absolutely certain that this is something they will not be able to do at all, maybe consider letting them try it. It will absolutely help build the confidence, self-worth, and skills for later independence that are very, very badly needed and very easily missed. I'm not a parent, however, so of course take this with a grain of salt; my experience may be slightly biased here.

ClawsOnPaws commented on Where are programming languages created? A zoomable map   pldb.io/blog/whereInnovat... · Posted by u/marinesebastian
pentacent_hq · a year ago
Elixir is probably the most prominent language that originated in Brazil (at Plataformatec in São Paulo) but the data CSV lists an address in Texas.
ClawsOnPaws · a year ago
I would have thought that would be Lua, not Elixir?
ClawsOnPaws commented on My daughter (7 years old) used HTML to make a website   naya.lol... · Posted by u/fintler
sideshowb · a year ago
Yet despite their ferocity still cute, they purr rather than roar!
ClawsOnPaws · a year ago
I had the opportunity to pet a cheetah last year, and I think that is one of those things that will stay with me for the rest of my life even if I never get another chance. The cheek rubs, the purrs, just like my cats would do. I haven't been able to stop thinking about it. Yes, I went straight to the cheetah page too. They're awesome! I never much thought about them until that day, but now it's a fascination.
ClawsOnPaws commented on ESpeak-ng: speech synthesizer with more than one hundred languages and accents   github.com/espeak-ng/espe... · Posted by u/nateb2022
dheera · a year ago
Why is the quality of open source TTS so horribly, horribly, horribly behind the commercial neural ones? This is nowhere near the quality of Google, Microsoft, or Amazon TTS, yet for image generation and LLMs almost everything outside of OpenAI seems to be open-sourced.
ClawsOnPaws · a year ago
I'm glad that it doesn't. A lot of us use these voices as an accessibility tool in our screen readers. They need to perform well and be understandable at very high rates, and they need to be very responsive. ESpeak is one of the most responsive speech synths out there, so for a screen reader the latency from key press to speech output is extremely low. Adding AI would just make it a lot slower and less predictable, and unusable for daily work, at least right now.

This is anecdotal, but part of what makes a synth work well at high speech rates is predictability. I know exactly how a speech synth is going to say something. This lets me put more focus on the thing I'm doing rather than trying to decipher what the synth is saying. Neural TTS always varies in how it says a thing, and at times those differences can be large enough to trip me up. Then I'm focusing on the speech again and not on what I'm doing. But ESpeak is very predictable, so I can let my brain do the pattern matching and focus actively on something else.

u/ClawsOnPaws · Karma: 1631 · Cake day: April 25, 2020