Readit News
blt commented on A visual introduction to big O notation   samwho.dev/big-o/... · Posted by u/samwho
leni536 · a day ago
A function f(x) is said to be O(g(x)) if f(x)/g(x) is bounded, that is, there is some C such that f(x)/g(x) < C for every x.

In computer science, f(x) is often some complexity function, such as the number of some specific operation performed when running an algorithm to completion.

blt · a day ago
This needs to be refined: f(x) is O(g(x)) if there exists some X >= 0 such that f(x)/g(x) is bounded for all x > X.

Otherwise, we cannot say that 1 is O(x), for example.
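A quick numeric sketch (mine, not from the thread) of why the threshold X matters: take f(x) = 1 and g(x) = x. The ratio f(x)/g(x) = 1/x is unbounded as x approaches 0, so the unrefined definition fails, but requiring boundedness only for x > X = 1 gives a bound C = 1, so 1 is O(x) under the refined definition.

```python
def ratio(x: float) -> float:
    """f(x)/g(x) with f(x) = 1 and g(x) = x."""
    return 1.0 / x

# No single C works over all x > 0: the ratio exceeds any bound near 0.
assert ratio(1e-6) > 1e5

# But for every x > X = 1 the ratio is at most C = 1, so 1 is O(x).
assert all(ratio(x) <= 1.0 for x in (1.0, 2.0, 100.0, 1e9))
```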

blt commented on A visual introduction to big O notation   samwho.dev/big-o/... · Posted by u/samwho
samwho · a day ago
So your contention is with the word “always”? It doesn’t always mean worst case? I got told off by someone else for _not_ saying this.

I really just want to find the way of describing this that won’t net me comments like yours. It is very disheartening to spend so much time on something, and really try to do the topic justice, to be met with a torrent of “this is wrong, that’s wrong, this is also wrong.” Please remember I am a human being trying to do my best.

blt · a day ago
The definition of big O notation is pure math - there is nothing specific to analysis of algorithms.

For example: "the function x^2 is O(x^3)" is a valid sentence in big-O notation, and is true.

Big O is commonly used in other places besides analysis of algorithms, such as when truncating the higher-order terms in a Taylor series approximation.

Another example is in statistics and learning theory, where we see claims like "if we fit the model with N samples from the population, then the expected error is O(1/sqrt(N))." Notice the word expected - this is an average-case, not worst-case, analysis.
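As a concrete illustration of the Taylor-series usage (my example, not from the thread): sin(x) = x - x^3/6 + O(x^5) means the truncation error, divided by x^5, stays bounded as x approaches 0 (the ratio tends to 1/120).

```python
import math

def truncation_ratio(x: float) -> float:
    """|sin(x) - (x - x^3/6)| / x^5, the O(x^5) remainder ratio."""
    error = abs(math.sin(x) - (x - x**3 / 6))
    return error / x**5

# The ratio stays under a fixed constant as x shrinks (it tends to 1/120 ~ 0.0083),
# which is exactly the claim "the error is O(x^5)".
for x in (0.5, 0.1, 0.05):
    assert truncation_ratio(x) < 0.01
```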

blt commented on Who Invented Backpropagation?   people.idsia.ch/~juergen/... · Posted by u/nothrowaways
brosco · 8 days ago
There is indeed a lot of crossover, and a lot of neural networks can be written in a state space form. The optimal control problem should be equivalent to training the weights, as you mention.

However, from what I have seen, this isn't really a useful way of reframing the problem. The optimal control problem is at least as hard as, if not harder than, the original problem of training the neural network, and the latter has mature and performant software for doing it efficiently. That's not to say there isn't good software for optimal control, but it's a more general problem, and therefore off-the-shelf solvers can't leverage the network structure very well.

Some researchers have made interesting theoretical connections like in neural ODEs, but even there the practicality is limited.

blt · 7 days ago
Yes, in most cases the reduction of supervised learning to optimal control is not interesting.

We can also reduce supervised learning to reinforcement learning, but that doesn't mean we should use RL algorithms to do supervised learning.

We can also reduce sorting a list of integers to SAT, but that doesn't mean we should use a SAT solver to sort lists of integers.

blt commented on Honda conducts successful launch and landing of experimental reusable rocket   global.honda/en/topics/20... · Posted by u/LorenDB
bee_rider · 2 months ago
Hondas at least used to (I haven’t kept up) have that great cheap/reliable car reputation… “the Honda of rockets” has a good ring to it I think, haha.
blt · 2 months ago
Awaiting the rocket engine equivalent of the K20.
blt commented on Low-background Steel: content without AI contamination   blog.jgc.org/2025/06/low-... · Posted by u/jgrahamc
blt · 3 months ago
tangentially, does anyone know a good way to limit web searches to the "low-background" era that integrates with address bar, OS right-click menus, etc? I often add a pre-2022 filter on searches manually in reaction to LLM junk results, but I'd prefer to have it on every search by default.
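One partial workaround (a sketch, not a full answer to the address-bar/context-menu integration): most browsers let you register a custom search keyword whose URL template bakes in a date cutoff, for example using Google's `before:` operator:

```text
https://www.google.com/search?q=%s+before:2022-01-01
```

Setting that as the default search engine applies the pre-2022 filter to every address-bar query; the `%s` placeholder is the browser's convention for the typed query.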
blt commented on Apple introduces a universal design across platforms   apple.com/newsroom/2025/0... · Posted by u/meetpateltech
BitwiseFool · 3 months ago
I suspect ego played a part in Steve Jobs selecting Tim Cook as his successor. Famous CEOs tend to pick a successor who is less charismatic and more risk-averse than they were. CEOs that retire 'honorably', so to speak, don't want someone who will outshine them or make sweeping changes to the brand or the company's organization. In other words, they want to preserve their legacy.

Tim Cook is exactly this kind of executive. While he has done an incredible job with leading the business and operational side of Apple, the public doesn't give credit for that sort of thing. Now imagine if Steve appointed someone just like himself and the business fumbled. Steve would hate for his legacy to be tarnished by appointing a brash successor.

All that being said, for what it's worth, I don't think anyone could have lived up to Steve's reputation. It is quite unfair to Tim Cook that he will always be compared to what people think Steve Jobs would have done.

blt · 3 months ago
IDK, I think Apple creating its own laptop/desktop-class CPU was a pretty bold move with a huge payoff. It's less sexy than introducing an entirely new category of product, but it's not exactly risk-averse either.
blt commented on My favourite fonts to use with LaTeX (2022)   lfe.pt/latex/fonts/typogr... · Posted by u/todsacerdoti
blt · 3 months ago
I always liked the look of ACM journals/conferences more than other venues. Their template uses Libertine, so it is my choice too.
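For reference, a minimal preamble along these lines (one common way to get Libertine text with matching math in a current TeX distribution; package names are the standard CTAN ones, not taken from the ACM template itself):

```latex
\documentclass{article}
\usepackage{libertine}             % Linux Libertine text fonts
\usepackage[libertine]{newtxmath}  % matching math fonts
\begin{document}
The quick brown fox jumps over the lazy dog: $e^{i\pi} + 1 = 0$.
\end{document}
```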
blt commented on Pathfinding   juhrjuhr.itch.io/deep-spa... · Posted by u/sebg
ninetyninenine · 3 months ago
One efficiency update you can make is that if background objects don’t move, then you don’t need to recalculate the path. So check if anything moved before recalculating.
blt · 3 months ago
Sure, but in games sometimes improving the average case is less important than the worst case.
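A minimal sketch (my own, with a placeholder pathfinder, not anyone's actual game code) of the parent's caching idea, which also shows why it only helps the average case: the frame where something *did* move still pays the full recomputation cost.

```python
def find_path(grid, start, goal):
    # Placeholder for a real pathfinding routine such as A*.
    return [start, goal]

class PathCache:
    """Reuse the last path until the obstacle layout or endpoints change."""

    def __init__(self):
        self._key = None
        self._path = None

    def path(self, grid, start, goal):
        # Fingerprint the grid; recompute only when it differs from last time.
        key = (hash(tuple(map(tuple, grid))), start, goal)
        if key != self._key:
            self._key = key
            self._path = find_path(grid, start, goal)  # worst-case cost still here
        return self._path
```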

Deleted Comment

blt commented on Odin, a pragmatic C alternative with a Go flavour   bitshifters.cc/2025/05/04... · Posted by u/hmac1282
mcbrit · 4 months ago
To point out something that is a fail: I don't want to hear about how you simulated 10M particles on the GPU without acceleration forces.
blt · 4 months ago
What is an acceleration force?

u/blt

Karma: 5025 · Cake day: May 2, 2012