CC is, imo, the best. The rest are largely on par with each other. The benefit of VSCode and Antigravity is that they have the most generous limits. I ran through Cursor's $20 limit in 3 days, whereas the same-tier VSCode subscription can last me 2+ weeks
I don't use any of these except YouTube (if only I could find the content elsewhere…) and I still pay for them when I purchase anything advertised on these properties, because, of course, the companies advertising on Google make all their customers pay for the free (lol) services. All advertising expenses are included in the price of the products, even if you never saw any ads.
We could easily charge for each of these services and still have them. Advertising is not necessary at all. It's just a way to make others pay for your services: a free-rider problem that externalizes costs onto those who don't partake in the scheme.
Pay your share and don't call free what others will subsidize. Unless it's a public service and we collectively agree on the split (votes and taxes, which we can debate publicly).
Nowadays I'm happy to pay, but that wasn't always the case. And I personally think that having an ad tier and a paid tier is fine. It serves everyone
> In a science fiction story, if you invented a superintelligent robot and asked it how to make money, it might come up with cool never-before-seen ideas, or at least massive fun market manipulation. But in real life, if you train a large language model on the internet and ask it how to make money, it will say “advertising, affiliate shopping links and porn.” That’s the lesson the internet teaches!
But I think it makes a lot of sense for very popular consumer products. In my honest opinion, I much prefer having services like Google, YouTube, Gmail, Maps, ChatGPT, etc. exist for free, but with ads, rather than not exist at all. Preferably with an option to pay and remove ads
Nowadays I'm happy to pay for YouTube Premium or an LLM subscription, but back during my student days I could not really afford it - and I'm glad there was a free tier (with ads)
Setting aside the obligatory HN dig at the end, LLMs are now commodities and the least important component of the intelligence system Apple is building. The hidden-in-plain-sight thing Apple is doing is exposing all app data as context and all app capabilities as skills. (See App Intents, Core Spotlight, Siri Shortcuts, etc.)
Anyone with an understanding of Apple's rabid aversion to being bound by a single supplier understands that they've tested this integration with all foundation models, that they can swap Google out for another vendor at any time, and that they have a long-term plan to eliminate this dependency as well.
> Apple admitted that the cost of training SOTA models is a capex heavy-lift they don't want to own.
I'd be interested in a citation for this (Apple introduced two multilingual, multimodal foundation language models in 2025), but in any case anything you hear from Apple publicly is what they want you to think for the next few quarters, vs. an indicator of what their actual 5-, 10-, and 20-year plans are.
Google and Apple will post-train Gemini together to Apple's specification. Google has the know-how as well as the infra, and will happily do this (more or less for free) to continue the mutually beneficial relationship - as well as to lock out competitors that asked for more money (Anthropic)
Once this goes live, provided Siri improves meaningfully, it is quite an expensive experiment to then switch to a different provider.
For any single user, the switching costs to a different LLM are next to nothing. But at Apple's scale they need to be extremely careful and confident that the switch is an actual improvement
Studio Ghibli, the Sora app. Go viral, juice the numbers, then turn the knobs down on copyrighted material. Atlas, I believe, was less successful than they would've hoped for.
And because of overly frequent version bumps, sometimes released as an answer to a Google launch rather than as a meaningful improvement, I believe they're also having a harder time going viral that way
Overall, OpenAI throws stuff at the wall and sees what sticks. Most of it doesn't and gets (semi) abandoned. But some of it does, and it makes for a better consumer product than Gemini
It seems to have worked well so far, though I'm sceptical it will be enough for long
Gemini Pro, neither as-is nor in Deep Research mode, even got the number of pieces or the relevant squares right. I didn't expect it to actually solve it. But I would have expected it to get the basics right and maybe hint that this is too difficult. Or pull up some solutions PDF, or write some Python code to brute-force search ... but just straight giving a totally wrong answer is like ... 2024 called, it wants its language model back.
Instead, plain Pro just gave a wrong solution and Deep Research wrote a whole lecture about it, starting with "The Geometric and Cognitive Dynamics of Polyomino Systems: An Exhaustive Analysis of Ubongo Puzzle 151" ... that's just bullshit bingo. My prompt was a photo of the puzzle and "solve ubongo puzzle 151"; in my opinion you can't even argue that this lecture was to be expected given my very clear and simple task description.
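For what it's worth, the brute-force search I had in mind fits in a few dozen lines. A minimal sketch of a backtracking polyomino solver; the board and pieces are made-up toy data, not the actual Ubongo 151 layout:

    # Minimal backtracking solver: place each piece (under rotation/reflection)
    # onto the free cells of a board so every cell is covered exactly once.
    # The board and pieces below are a toy example, not the real Ubongo 151 data.

    def orientations(piece):
        """All distinct rotations and mirror images of a piece given as (row, col) cells."""
        shapes = set()
        for flip in (False, True):
            pts = [(r, -c) if flip else (r, c) for r, c in piece]
            for _ in range(4):
                pts = [(c, -r) for r, c in pts]  # rotate 90 degrees
                min_r = min(r for r, _ in pts)
                min_c = min(c for _, c in pts)
                shapes.add(frozenset((r - min_r, c - min_c) for r, c in pts))
        return [sorted(s) for s in shapes]

    def solve(board, pieces, placed=()):
        """board: set of free (row, col) cells; pieces: list of cell sets. Returns placements or None."""
        if not pieces:
            return list(placed) if not board else None
        anchor = min(board)  # the top-left-most free cell must be covered by something
        for i, piece in enumerate(pieces):
            for shape in orientations(piece):
                for r, c in shape:
                    dr, dc = anchor[0] - r, anchor[1] - c
                    placement = {(pr + dr, pc + dc) for pr, pc in shape}
                    if placement <= board:
                        rest = pieces[:i] + pieces[i + 1:]
                        result = solve(board - placement, rest, placed + ((i, placement),))
                        if result is not None:
                            return result
        return None

    # Toy example: a 2x4 rectangle tiled by two L-tetrominoes.
    board = {(r, c) for r in range(2) for c in range(4)}
    L = {(0, 0), (0, 1), (0, 2), (1, 0)}
    print(solve(board, [L, L]))

For an Ubongo-sized board this search finishes instantly; the point is that it's exactly the kind of tool the model could have written and run instead of confidently making an answer up.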
My mental model for language models is: an overconfident, eloquent assistant who talks a lot of bullshit but has some interesting ideas every now and then. For simple tasks it's simply a summary of what I could google myself, but asking an LLM saves some time. In that sense it's Google 2.0 (or 3.0 if you will)
I'm trying to create a comprehensive list of English standup specials. Seems like a good fit! I've tried numerous times to prompt it: "provide a comprehensive list of English standup specials released between 2000 and 2005. The output needs to be a csv of verified specials with the author, release date and special name. I do not want any other lecture or anything else. Providing anything except the csv is considered a failure". Then it creates its own plan and I go further, clarifying explicitly that I don't want lectures...
It goes on to hallucinate a bunch of specials and provide a lecture on "2000 the era of X on standup comedy" (for each year)
I've tried this in 2.5 and 3. Numerous time ranges and prompts. Same result. It gets the famous specials right (usually), hallucinates some info on less famous ones (or makes them up completely) and misses anything more obscure
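For the "no lectures, just structured data" half of the problem, asking for JSON through the API and building the CSV yourself tends to work better than prompt threats. A rough sketch, assuming the google-generativeai Python SDK (newer SDKs spell the config differently) and made-up field names; note it does nothing about hallucinated specials:

    import csv
    import json

    import google.generativeai as genai  # assumed SDK; check the current Gemini SDK docs

    genai.configure(api_key="YOUR_API_KEY")  # placeholder key
    model = genai.GenerativeModel("gemini-1.5-pro")  # model name is illustrative

    prompt = (
        "List English-language standup specials released between 2000 and 2005. "
        "Return a JSON array of objects with keys 'comedian', 'release_date', 'title'. "
        "Only include specials you are certain exist."
    )

    # response_mime_type forces JSON-only output, which handles the "no lecture" part.
    response = model.generate_content(
        prompt,
        generation_config={"response_mime_type": "application/json"},
    )

    rows = json.loads(response.text)  # assumes the model returned a top-level array
    with open("specials.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["comedian", "release_date", "title"])
        writer.writeheader()
        writer.writerows(rows)

The verification part still has to happen outside the model, e.g. cross-checking each row against a source you trust, because constrained output shapes the format, not the facts.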
I'm sure LLM providers will also figure it out in due time. Consumer products are generally a good fit for ads, even if it takes time to reach full potential
I can only assume the Aluminium OS would aim to do the same
Then I assume they'll roll it out further
For better or worse, I do own Pixel 10
So far, the circular financing from Nvidia has been peanuts for the company. It's roughly equal to giving a 5% discount on hardware, not a big deal when the profit margin is 70%. Trying to prop up new neoclouds and competition is a good idea.
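Rough illustrative arithmetic (made-up unit price, not Nvidia's actual financials):

    # Illustrative only: what a 5% effective discount does to a 70% gross margin.
    price, margin = 100.0, 0.70
    cost = price * (1 - margin)              # 30 per unit
    discounted = price * 0.95                # investment treated as a 5% discount
    new_margin = (discounted - cost) / discounted
    print(f"{new_margin:.1%}")               # ~68.4%, still a fat margin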
As I understand it, the OpenAI investment was a much bigger effective discount, but still safe because Nvidia invests gradually, in installments, only when OpenAI invests in data centers: tit for tat. Maybe OpenAI wanted to get the money now and invest it later, as they seem to be running out of cash.
Nvidia might have wanted more exclusivity/attachment. And OpenAI still seems to have no problem raising money. So maybe there was just a commitment mismatch
Pure speculation though