Readit News
rmckayfleming commented on The plan-execute pattern   mmapped.blog/posts/29-pla... · Posted by u/surprisetalk
rmckayfleming · a year ago
"I feel uneasy about design patterns. On the one hand, my university class on design patterns revived my interest in programming. On the other hand, I find most patterns in the Gang of Four book to be irrelevant to my daily work; they solve problems that a choice of programming language or paradigm creates."

My relationship with design patterns changed when I stopped viewing them as prescriptive and started viewing them as descriptive.

rmckayfleming commented on Sega Saturn Architecture – A practical analysis (2021)   copetti.org/writings/cons... · Posted by u/StefanBatory
MBCook · a year ago
The cartridge ended up being a huge sore spot too.

Nintendo wanted it because of the instant access time. That’s what gamers were used to and they didn’t want people to have to wait on slow CDs.

Turns out that was the wrong bet. Cartridges just cost too much and if I remember correctly there were supply issues at various points during the N64 era pushing prices up and volumes down.

In comparison, CDs were absolutely dirt cheap to manufacture. And people quickly fell in love with all the extra stuff that could fit on a disc compared to a small cartridge. There was simply no way anything like Final Fantasy 7 could have been done on the N64: games with FMV sequences, real recorded music, just large numbers of assets.

Even if everything else about the hardware was the same, Nintendo bet on the wrong horse for the storage medium. It turned out the thing they prioritized (access time) was not nearly as important as the things they gave up (price, storage space).

rmckayfleming · a year ago
Not just dirt cheap, the turnaround time to manufacture was significantly lower. Sony had an existing CD manufacturing business and could produce runs of discs in the span of a week or so, whereas cartridges typically took months. That was already a huge plus for publishers, since it meant they could respond more quickly if a game happened to be a runaway success. With cartridges they could end up undershooting and losing sales, or overshooting and getting stuck with expensive excess inventory.

Then, to top it all off, Sony had much lower licensing fees! So publishers got “free” margin to boot. The PlayStation was a sweet deal for publishers.

rmckayfleming commented on Intel puts 1nm process (10A) on the roadmap for 2027   tomshardware.com/pc-compo... · Posted by u/rbanffy
cj · 2 years ago
I've noticed a lot of people, even ones making $200k+ in salary, are really bad when it comes to making decisions that involve any amount of money.

E.g. I've been in meetings with multiple developers whose combined salaries add up to well over $1 million/year, debating for way too much time whether it's worth it to buy a $500/month service to help automate some aspect of devops.

Maybe this wasn't the case for your specific anecdote, but in the scenario I'm describing I got the feeling that a lot of people think about business purchases in the context of their own personal finances rather than in the context of the business's finances, leading them to be extremely cautious with things like a $10k purchase that would be "expensive" if purchased as an individual and "cheap" if purchased as a company.

In those cases, getting an exec to come in and pull the trigger can help. The exec is used to looking at big-picture budgets/strategy, which ICs aren't. (Although I'm sure someone here can come up with another anecdote proving that wrong)

rmckayfleming · 2 years ago
Yea, people generally don't think about these things. People were genuinely surprised to find out that 70% of our expenses as a software business were... salaries. The next largest category? Rent. Everything else was effectively a rounding error.
rmckayfleming commented on Homes need to be built for better internet   theverge.com/2023/12/8/23... · Posted by u/rntn
rmckayfleming · 2 years ago
I just bought a house this year; the previous owners lived in it for about 2 years from brand new. Thankfully it has an ethernet drop to every room upstairs, and in reasonable places on the main floor. Far better than the situation at the last place I lived.

But annoyingly, it's all Cat5e! Sure, it's fine for gigabit, but now that I have proper networking equipment, I wish they'd spent the extra few bucks and put in Cat6A or Cat7 so that I could get 10Gbps to my office from the rack in the basement...

rmckayfleming commented on Stable Diffusion: Real-time prompting with SDXL Turbo and ComfyUI running locally   old.reddit.com/r/StableDi... · Posted by u/belltaco
pbalcer · 2 years ago
On my machine with AMD RX 7900XT, it takes ~0.17s per image. Are you using SD Turbo Scheduler node?
rmckayfleming · 2 years ago
Have you used the 7900XT with LLMs at all?
rmckayfleming commented on Nvidia's earnings are up 206% from last year as it continues riding the AI wave   arstechnica.com/gadgets/2... · Posted by u/PaulHoule
automatic6131 · 2 years ago
AMD's Radeon group has, by any definition, kept up. In gaming, in the tier of cards that people actually buy, they trade blows with Nvidia.

But making a GPU is hard. Making a good GPU is even harder. This becomes a chicken-and-egg problem. If Nvidia's hardware has quirks, and you're a game developer, you optimize your game for Nvidia's hardware. If someone else tries to build a GPU, they discover that many popular games run like sh*t, because of tens of years of strange workarounds to ship games: drivers rewriting commands for specific games, game devs abusing the DirectX spec or doing it wrong.

However, Nvidia set out 10-15 years ago to corner the professional general-purpose GPU compute market with a software layer called CUDA. Software was written and optimised for CUDA, and not any other generic graphics library, for reasons I don't know (I'm not a graphics developer, just a gamer). So now Nvidia enjoys a moat in GPGPU (as it was called).

rmckayfleming · 2 years ago
I do wonder how much of an impact the PS5 and Xbox Series have on this though since they both have RDNA2 GPUs. Consoles tend to face higher optimization pressure, especially as the generation wears on. It might be why AMD has kept so competitive in gaming as of late.
rmckayfleming commented on The Revival of Medley/Interlisp   theregister.com/2023/11/2... · Posted by u/samizdis
rmckayfleming · 2 years ago
It's more that it's easier to use C for two reasons. The first is that C is really popular and therefore pretty portable. It's a lingua franca. The other is that because the hosts are largely defined in C, it's easier to interact with. Of course, the host doesn't actually "speak C", it follows some form of ABI. But the reality is that implementing each ABI is non-trivial and you can avoid a lot of pain by just using the host's C compiler/linker/etc. that implements it for you.
rmckayfleming · 2 years ago
Like, SBCL compiles and assembles directly to machine code, that's very much the Lisp way. But SBCL has a lot of C that's involved in getting the SBCL image running and interacting with the host.
rmckayfleming commented on The Revival of Medley/Interlisp   theregister.com/2023/11/2... · Posted by u/samizdis
uticus · 2 years ago
> Developers from Fuji Xerox wrote a portable VM in C to run the environment on different host platforms, called Maiko.

I'm always confused by the relationship of C and Lisp(s). Here the VM is written in C. Yet elsewhere there seems to be at least one good example of a Lisp compiler written in Lisp [0]. What was the reason for writing Maiko in C, versus Lisp "all the way"?

[0] "The first complete Lisp compiler, written in Lisp, was implemented in 1962 by Tim Hart and Mike Levin at MIT, and could be compiled by simply having an existing LISP interpreter interpret the compiler code, producing machine code output able to be executed at a 40-fold improvement in speed over that of the interpreter.[19] This compiler introduced the Lisp model of incremental compilation, in which compiled and interpreted functions can intermix freely. The language used in Hart and Levin's memo is much closer to modern Lisp style than McCarthy's earlier code. " https://en.wikipedia.org/wiki/Lisp_(programming_language)#Hi...

rmckayfleming · 2 years ago
It's more that it's easier to use C for two reasons. The first is that C is really popular and therefore pretty portable. It's a lingua franca. The other is that because the hosts are largely defined in C, it's easier to interact with. Of course, the host doesn't actually "speak C", it follows some form of ABI. But the reality is that implementing each ABI is non-trivial and you can avoid a lot of pain by just using the host's C compiler/linker/etc. that implements it for you.
rmckayfleming commented on It's 2023, so of course I'm learning Common Lisp   log.schemescape.com/posts... · Posted by u/behnamoh
rmckayfleming · 2 years ago
CL is the x86 of the Lisps. Successful because of backwards compatibility, but also ugly because of it.
rmckayfleming commented on It's 2023, so of course I'm learning Common Lisp   log.schemescape.com/posts... · Posted by u/behnamoh
maxwelljoslyn · 2 years ago
This nicely summarizes some of my frustrations with using Clojure for my master's thesis. I'm not unhappy with the choice. Clojure allows such a juicy crossover between "everything is a key-value map, mannn" and "If it has :quack key set to true, treat it like a duck" which works really well for entity-component-system game-design-y things.

but the development story in Common Lisp ... and my gawd, the CONDITION SYSTEM ... were things that I sorely missed for the last year. and I'm not even that experienced of a CL hacker. It just grew on me so quickly. If only CLOS and the primitive data types in CL played together more nicely than they seem to.

rmckayfleming · 2 years ago
I know. I've been spending a lot of time with CL, Scheme, and Clojure the past few years, and the ideal Lisp is some combination of them all. There are aspects of each that I miss in the others. CL has the nicest environment and development story (generally speaking). Scheme feels more refined in the small. And although they can be divisive, I really appreciate Clojure's data structure literals.

u/rmckayfleming

Karma: 278 · Cake day: March 20, 2012
About
Co-Founder and CTO of Chalk.com.