Readit News
havercosine commented on Leaving Meta and PyTorch   soumith.ch/blog/2025-11-0... · Posted by u/saikatsg
intermerda · 2 months ago
Do you have experience in both JAX and PyTorch? Why do you prefer JAX?
havercosine · 2 months ago
Not OP. I have production/scale experience in PyTorch and toy/hobby experience in JAX. I wish I had the time or liberty to use JAX more. It consists of a small, orthogonal set of ideas that combine like Lego blocks, so I can attempt to reason about performance from first principles. The documentation is super readable and strives to make you understand things.

JAX seems well engineered. One could argue that so was TensorFlow. But the ideas behind JAX were developed outside Google (autograd), so it has struck the right balance by staying close to idiomatic Python/NumPy.
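As a sketch of the autograd lineage mentioned here: reverse-mode differentiation can be built from a tiny class that records each operation's local gradients and applies the chain rule backwards. The `Var` class below is a hypothetical toy for illustration, not the actual API of autograd or JAX.

```python
class Var:
    """A toy reverse-mode autodiff value (illustration only)."""

    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent_var, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Chain rule: accumulate the incoming gradient, then push it
        # to each parent scaled by the local gradient.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = x * x + x   # f(x) = x^2 + x, so f'(3) = 2*3 + 1 = 7
y.backward()
print(x.grad)   # 7.0
```

JAX generalizes this same idea to traced array programs, which is part of why it feels like a few small pieces composing cleanly.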

PyTorch is where the tailwinds are, though. It is a wildly successful project that has acquired a ton of code over the years, so it is a little harder to figure out how something works (say, torch.compile) from first principles.

havercosine commented on Bitfrost – LLM gateway 90x faster than Litellm at p99   github.com/maximhq/bifros... · Posted by u/havercosine
havercosine · 5 months ago
Bifrost is the fastest LLM gateway on the market. Built in Go with careful garbage collection, it adds just about 11 microseconds of overhead at 5,000 requests per second (with 4,100 RPS throughput) on a t3.xlarge instance.

The benchmarks are here: https://github.com/maximhq/bifrost/blob/main/docs/benchmarks...

Some features:
• Built-in governance and routing rules
• Supports over 1,000 models from different providers
• MCP gateway included (HTTP, SSE, and console transport)
• Out-of-the-box observability and OTel-compatible metrics

havercosine commented on Show HN: I got laid off from Meta and created a minor hit on Steam    · Posted by u/newobj
newobj · 10 months ago
Godot 4.2 / C#. Godot is an amazing engine and I highly recommend it for indie devs. The iteration cycle is a few seconds. IMHO, using Unity will absolutely slow you down, as a designer/programmer. I can't comment on the reality of a 3d game with a complicated art pipeline, however. But for a game like Ballionaire, Unity or Unreal would have been a mistake.

Ballionaire was fun by day 3, the first time a ball dropped and hit a trigger that caused an effect. It was like a bolt of lightning to me. I am a VERY pessimistic person, so the fact that it felt compelling that early, and that anyone who saw it could understand it and feel the fun, was a huge sign to me. Unfortunately, not every game has a premise that allows for that. I don't think you could hope for the same kind of feeling while making a 4X, for example.

But if you're making a game that ultimately should feel fun from moment to moment, like these kinds of quick-play games are, I think you can get there quickly and with little work, if the idea is solid. And if you can't get it to feel fun, I would wonder whether the idea is solid, rather than assuming it needs more time/polish, etc.

havercosine · 10 months ago
A fellow Godot enthusiast here. Love to see Godot being used in a commercially successful indie game like this. Around 2021-22, I tried (unsuccessfully!) building educational video games for maths using Godot, and I have fond memories of being in a flow state while working with it. IMO Godot fits a programmer's brain much better than Unity et al.
havercosine commented on Microsoft cancels leases for AI data centers, analyst says   bloomberg.com/news/articl... · Posted by u/suraci
havercosine · 10 months ago
I'm honestly in two minds on this one. On one hand, I do agree that valuations have run a bit too far in AI and some shedding is warranted. A skeptical position coming from a company like MSFT should help.

On the other hand, I think MSFT was trying to pull a classic MSFT move on AI. They thought they could piggyback on OpenAI's hard work and profit massively from it, and if they are now having second thoughts, that's for the better too. MSFT has mostly launched meh products on top of AI.

havercosine commented on Thank HN: My bootstrapped startup got acquired today    · Posted by u/paraschopra
havercosine · a year ago
Paras, as an Indian founder myself, I've watched your journey for a few years now. You are an inspiration and a thoughtful leader. Your "Mental Models for Startup Founders" is a very well-written mirror for every founder to look into.

Hope you get some nice time off and go back with vigour to Turing's Dream now...

havercosine commented on How Might We Learn?   andymatuschak.org/hmwl/... · Posted by u/ColinWright
gofreddygo · 2 years ago
I have rejected Anki and its competitors for learning. I found it shallow and a drag. It needs a high initial investment (prepping cards, committing to reviewing every day) with zero instantaneous results (a week or two in, the cards are still fuzzy). But these are superficial problems.

My deeper beef with this method is the complete absence of any emphasis on discovering or forming connections between related things. When we're trying to learn, it's a superpower to start seeing patterns in what we learn; patterns form buckets that we can put new concepts and information into. Without that, the learning is ... shallow.

I found a better way. I map out full concepts to fit on single sheets of printer paper. The front side has mostly words, with lines connecting them or forming groups. The back side is for the related drudgery (formulae, dates, numbers, names). I review new sheets every day until I can reproduce them, front and back, without any help, and then slowly introduce days of spacing between repetitions.

This is way more satisfying: no tech involved, no algorithms, just hard work, and it's way faster. I don't have any evidence of this working long-term; the things I put so much effort into learning to reproduce with such accuracy are usually useful only in the short term anyway. So it works for me.
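The schedule described above (daily reviews until the sheet is reproducible, then slowly widening gaps) could be sketched as follows. The `review_days` helper and the doubling factor are hypothetical illustrations, not taken from any app or from the comment itself.

```python
def review_days(daily_reviews, spaced_reviews, factor=2):
    """Return day offsets for each review session: a daily phase
    followed by an expanding-interval phase."""
    days = list(range(daily_reviews))  # e.g. days 0..4, once per day
    gap = 1
    day = days[-1]
    for _ in range(spaced_reviews):
        gap *= factor                  # widen the gap each time
        day += gap
        days.append(day)
    return days

print(review_days(5, 3))  # [0, 1, 2, 3, 4, 6, 10, 18]
```

Spaced-repetition software automates exactly this kind of expanding schedule per card; the pen-and-paper version just applies it per sheet by hand.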

havercosine · 2 years ago
Andy's collaborator Michael Nielsen has a nice blog post, "Using spaced repetition systems to see through a piece of mathematics"[0]. He makes the point that the idea is to commit more and more higher-order concepts to memory. But he does emphasise that Anki is only one way to achieve this, and a simpler pen-and-paper method like the one you describe might work as well.

[0] : https://cognitivemedium.com/srs-mathematics

havercosine commented on GPT-4o   openai.com/index/hello-gp... · Posted by u/Lealen
kenjackson · 2 years ago
GPT-4 already seems better at reasoning than most people. It just has an unusual training domain of Internet text.
havercosine · 2 years ago
I was going to say the same thing. For some real-world estimation tasks where I don't need 100% accuracy (for example: analysing the working capital of a business from its balance sheet, or analysing images to estimate inventory), the job done by GPT-4o is better than that of fresh MBA graduates from tier 2/tier 3 cities in my part of the world.

Job seekers currently in college have no idea what is about to hit them in 3-5 years.

havercosine commented on Hi everyone yes, I left OpenAI yesterday   twitter.com/karpathy/stat... · Posted by u/mfiguiere
dontreact · 2 years ago
Unpopular opinion… but IMO almost all of Karpathy’s fame and influence comes from being an incredible educator and communicator.

Relative to his level of fame, his actual level of contribution as far as pushing forward AI, I’m not so sure about.

I deeply appreciate his educational content and I’m glad that it has led to a way for him to gain influence and sustain a career. Hopefully he’s rich enough from that that he can focus 100% on educational stuff!

havercosine · 2 years ago
Disagreeing here! I think we often overlook the value of excellent educational materials. Karpathy has truly revitalized the AI field, which is often cluttered with overly complex and dense mathematical descriptions.

Take CS231n, for example, which stands as one of Stanford's most popular AI/ML courses. Think about the number of students who took that class from around 2015 to 2017 and have since advanced in AI. It's fair to say a good chunk of the credit goes back to that course.

Instructors who break it down, showing you how straightforward it can be, guiding you through each step, are invaluable. They play a crucial role in lowering the entry barriers into the field. In the long haul, it's these newcomers, brought into AI by resources like those created by Karpathy, who will drive some of the most significant breakthroughs. For instance, his "Hacker's Guide to Neural Networks," now almost a decade old, provided me with one of the clearest 'aha' moments in understanding back-propagation.

havercosine commented on Pakistan cuts off phone and internet services on election day   techcrunch.com/2024/02/07... · Posted by u/moose44
standardUser · 2 years ago
Any test that is so big that it requires shutting down large sections of society is way, way too big. And I mean, WAY too big, like grotesquely so. Just imagining that level of stress stresses me out.
havercosine · 2 years ago
Most countries in South and Southeast Asia have made exams a make-or-break deal for every student. In India, there are so many kids staying away from home in cities that are essentially just exam-preparation centres, with routine news of suicides.

Looking back on my life, I think we Asians have definitely stretched this way too far. Unfortunately, in populous, young countries like ours, these exams are perceived as the only non-corrupt way out of the low-income trap. So this will go on :-(

u/havercosine
Karma: 181 · Cake day: May 25, 2022
About: games, maths, data science, machine learning