It's still very immature, but if you have a mixture of languages (C, C++, Python, Rust, etc.), I highly recommend checking it out.
You're talking like it's 1997.
The typical Linux user doesn't have to do that. Only those who buy unsupported devices on purpose, for the challenge of making them work.
Get any of the modern laptops with good battery life, install Linux + Elementary OS without any hacks or workarounds (or better yet, i3wm, which is the best window manager for laptops), and never look back.
Or do what I do, which is buy $200 Dells/ThinkPads off eBay, and for anything CPU-heavy, just SSH into your home server.
Personally I went a step further and use a lapdock with a Samsung phone: it acts like a laptop with Termux, and I can do pretty much everything with good battery life, because the lapdock's battery also charges the phone.
Seriously, I would love to switch back to a full-time Linux distro, but I'm more interested in getting work done and having a stable & performant platform. Losing a day of productivity fixing drivers and patching kernels gets old. The M-series laptops have been the perfect balance for me so far.
(1) Executives with emotional attachment to certain leadership styles that are enabled by physical presence,
(2) Interest among the investor class in the commercial real estate market. The business impacted may not be invested in it, but its shareholders in sufficient numbers probably are, and so are the influential constituents of the politicians it wants favors from, in a time of increasingly naked political corruption and cronyism.
(3) Backdoor layoffs. RTO is unpopular with large swathes of the workforce, and people will quit because of it. That's good for a firm likely to be cutting positions anyway: there's no need for severance, regardless of scale there's no WARN Act notice requirement, and if you still have to cut more positions afterwards, it makes it less likely that those cuts will hit WARN Act thresholds. And while the people who quit may not be the ones you would choose to cut first, they are the ones most likely to quit anyway in the kind of less-employee-friendly and financially leaner (in real terms) times likely to exist for a while after the cuts.
It's far more likely a mixture of (1) and actual results: in-person/hybrid teams produce better outcomes (even if the reason why hasn't been deeply evaluated, or ultimately falls on management).
In LLMs that balance shows up as how often the model hallucinates versus how often it says it doesn't know. If you push toward precision, you end up with a model that constantly refuses: "What's the X of Y?" "I don't know." "Can you implement a function that does K?" "I don't know how." "What could be the cause of G?" "I can't say." As a user, that gets old fast; you just want it to try, take a guess, and let you be the judge of it.
Benchmarks and leaderboards usually lean toward recall because a model that always gives it a shot creates a better illusion of intelligence, even if some of those shots are wrong. That illusion keeps users engaged, which means more users and more money.
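To make the precision/recall trade concrete, here's a toy Python sketch (the questions, answers, and numbers are all made up; None stands for a refusal):

    # Toy QA eval: precision is scored over attempted answers only,
    # recall over all questions, so abstaining inflates one and sinks the other.
    def precision_recall(answers, gold):
        attempted = [(a, g) for a, g in zip(answers, gold) if a is not None]
        correct = sum(1 for a, g in attempted if a == g)
        precision = correct / len(attempted) if attempted else 0.0
        recall = correct / len(gold)  # refusals count as misses here
        return precision, recall

    gold     = ["a", "b", "c", "d", "e"]
    cautious = ["a", None, None, "d", None]  # refuses when unsure
    eager    = ["a", "x", "c", "d", "x"]     # always takes a shot

    print(precision_recall(cautious, gold))  # (1.0, 0.4): never wrong, often useless
    print(precision_recall(eager, gold))     # (0.6, 0.6): wrong on 2 of 5 tries, but it tries

The cautious model never says anything false but answers only two questions out of five; on a recall-leaning leaderboard, the eager one wins.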
And that's why LLMs hallucinate :P
Or we're just having too much fun making stuff to bother making videos to convince people who are never going to be convinced.
We agree most problems stem from:

1. Getting lazy and auto-accepting edits. Always review changes and make sure you understand everything.

2. Skipping a clearly written specification document before starting complex work items.

3. Not breaking tasks down into manageable chunks of scope.

4. Unclear code architecture. If it's hard for a human to understand (e.g. poor separation of concerns), it will be hard for the LLM too (rough sketch below).
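On point 4, a contrived Python sketch of what I mean (the CSV schema id,status,amount and the tax multiplier are made up; the two versions compute the same thing):

    # Tangled: file I/O, parsing, and business rules in one expression.
    def report_tangled(path):
        return sum(float(l.strip().split(",")[2])
                   for l in open(path)
                   if l.strip() and l.split(",")[1] == "paid") * 1.2

    # Separated: each concern has a name and can be read (and tested) alone.
    TAX = 1.2  # hypothetical multiplier

    def load_rows(path):
        with open(path) as f:
            return [line.strip().split(",") for line in f if line.strip()]

    def paid_total(rows):
        return sum(float(amount) for _, status, amount in rows if status == "paid")

    def report(path):
        return paid_total(load_rows(path)) * TAX

The tangled version is the kind of thing an LLM will happily keep growing until neither of you can follow it; the separated one gives it small, named pieces to work on.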
But yeah, I would never waste my time making that video. Having too much fun turning ideas into products to care about proving a point.
An end-to-end _trained_ model that spits out a textured mesh of the same result would have been an innovation. The fact that they didn't do that suggests they're missing something fundamental for world model training.
The best thing I can say is that maybe they can use this to bootstrap a dataset for a future model.