Readit News
elromulous · 2 years ago
In general I agree that the tensor hardware has been mostly a disappointment. I mean this more in terms of general performance than ML.

In ML I think the story is a little different. Hardware design/production pipelines are measured in years. The LLM "revolution" is relatively recent. For the Pixel 8, I imagine some folks at Google had to make the really tough decision of whether to run weaker, smaller models on device, or run the giant LLM models in the cloud, giving much better results, albeit slower (and, of course, with the requirement for an Internet connection).

silisili · 2 years ago
99% of smartphone consumers don't care about 'ML performance' as a primary concern. People care how fast apps open, and about battery life. From that perspective, the tensor chips are pigs that nobody is excited about.
senkora · 2 years ago
iPhones have all sorts of little niceties that are directly enabled by good ML performance. For example, automatic OCR of text within images, even on websites. I think the facial recognition for grouping photos by person is also done locally by the ML chip.
stavros · 2 years ago
I very much do care if my phone understands my questions accurately when I ask them, and doesn't return irrelevant results, though. Nobody says "yeah Siri is crap, but have you seen that power usage?!".
matthewmacleod · 2 years ago
That's a bit reductive. They also care if their apps work _well_ – if their pictures look nice, if their dictation is responsive, if their assistant understands queries, if OCR is accurate… which is all about "ML performance".
alpaca128 · 2 years ago
I do care about my phone doing OCR and other tasks locally without Google, and tensor chips are the reason this can be done without draining the battery too much.
madeofpalk · 2 years ago
You're right that 99% of smartphone consumers don't care about 'ML performance', but they do care about autocomplete being good and fast. Or photos doing object detection to create stickers for messages.

If you're able to offload these 'everyday' tasks to a lower power chip, you also extend battery life. Something they care about :)

RantyDave · 2 years ago
Unless they can do something that was going to have to be run on the CPU - and use less power doing it. I think we're rapidly getting to the point where this AI stuff is no longer optional.
vineyardmike · 2 years ago
I totally imagine that they intended to run models locally when this phone was on a whiteboard in an engineering meeting but…

(1) the models team just can't/won't trim the models, instead focusing on bigger and better. This is probably doubly true now that the AI wars have taken over their focus

And

(2) they realize they probably need to ship these features to every non-Pixel phone too.

ddalex · 2 years ago
... why do they need to ship these features to non-Pixel phones? It's Google's special software sauce that makes people buy Pixels. Google doesn't ship advanced Photos app tools on non-Pixel phones.
pmontra · 2 years ago
Why is point 2 a problem? They can run locally if there is a chip that can do that, and offload to the cloud if not, provided there is a connection fast enough to transfer the data both ways in a reasonable time. I'm thinking about photo and (especially) video processing. Translating would probably be OK even on slow connections. Furthermore, IMHO shooting photos and videos in the middle of nowhere with no connection is a more common use case than translating something there.
cesaref · 2 years ago
I think this is the point - each design team and company bet on what they believe their hardware is likely to need to perform. The performance vs power consumption trade off is really tricky when you have to guess what the software landscape will look like a few years out. Get it right, and you have a winner, get it slightly wrong and you either have 'poor battery life' in reviews, or you offload stuff which ideally would be running locally (which is the suggestion in this case).

I'm not sure I want LLMs on my phone anyway - the situations where you want a 'digital assistant' are generally ones where you've got connectivity, so offloading seems like a sensible fit for me. Of course I'm making that assumption and I'll probably look silly, but that's progress, right?

Of more interest is the heavy lifting for image processing to make the phone camera not suck (it's all smoke and mirrors with phone cameras, and more inference on device helps).

injidup · 2 years ago
I'd love to have an LLM on my phone when there's no connectivity. There are many situations where a personal assistant would come in handy.

In the wilds of Canada somewhere:

Hey google. Lead me through the steps to create a splint and stop the bleeding. My buddy just got attacked by a bear and is bleeding out.

What? That's not a bear? It's a raccoon? Oh... anyway.

Lead me through the steps to start a fire with wet wood, and how do I prepare a bear, I mean raccoon, carcass for good eating? I have some old copper wire and a magnet. I need to make a phone charger. Lead me through it. Also make sure the fire is smokeless, as we don't want to get discovered and sent back to the supermax.

croes · 2 years ago
Speaking of disappointing.

What happened to the Soli radar chip in Pixels?

SketchySeaBeast · 2 years ago
You ever go all in on a technology that tries to sell your users on a big bezel in exchange for being able to use their phone without touching it and then have a global emergency come around that largely removes the only reliable way to authenticate the user without them touching the phone?
crest · 2 years ago
I’m sure Google had to think long and hard about whether they wanted all those queries forwarded to their data centers to analyse (and enrich their user profiles).
wmf · 2 years ago
They could process stuff locally and also send data to the cloud.
jeffbee · 2 years ago
The actual experience of using Pixel ML features is extremely good, so it feels like the implementation details are inside baseball. Just like the last line of the article says: they are class-leading features. Speech-to-text and the reverse are the most noticeable improvements over iOS.
peterhull90 · 2 years ago
Can anyone just clarify for me what these AI / machine learning chips are? As far as I can tell, they are general-purpose microprocessors with some added instructions which accelerate operations that are commonly used (matrix multiplication, possibly?), but there's a lot I don't quite understand, because some of the info is marketing and some of it is heavy technical stuff, and my knowledge falls in between!
electricships · 2 years ago
here is my good deed for the day:

modern AI is just vector multiplication. any AI chip is just 10,000s of very simple cores which can do vector float operations and little else. this also entails clever trade-offs of shared cache and internal bandwidth.

(as a thought experiment, consider a naive million by million matrix multiplication. this will take a single cpu about 1 year! how do we reduce this to 1s?)
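The arithmetic behind that thought experiment can be sketched in a few lines. The sustained single-core rate below is an assumption (~10 GFLOP/s); a faster core lands nearer the quoted year, but the ballpark is the same:

```python
# Back-of-envelope check of the thought experiment above.
# Assumed: ~10 GFLOP/s sustained on a single CPU core.
n = 1_000_000                     # million x million matrices
flops = 2 * n**3                  # naive matmul: n^3 multiply-adds
single_core_rate = 10e9           # assumed ~10 GFLOP/s

seconds = flops / single_core_rate
years = seconds / (365 * 24 * 3600)
print(f"{flops:.0e} FLOPs, ~{years:.0f} years on one core")

# Finishing in ~1 s instead means sustaining ~2e18 FLOP/s, hence the
# appeal of spreading the work across huge numbers of simple
# multiply-accumulate units.
```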

the end

Symmetry · 2 years ago
Nowadays AI chips are specialized in not just vector multiplication but matrix multiplication. Just as moving from scalar math to vectors brings savings in control and routing logic, moving from vector to matrix does the same. Taking a result from a floating point unit, moving it to a big, multi-ported register file, and then reading it out again to feed into another floating point unit is often a much bigger draw of power than the multiplication or addition itself. To the extent you can minimize that by feeding the results of one operation directly into the processing structures for the next, you've got a big win.
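A toy sketch of that data-flow point, purely illustrative: in a matrix unit the running sum stays in a local accumulator and feeds the next multiply-add directly, rather than making a round trip through a shared register file between operations.

```python
def mac_matmul(a, b):
    """Toy matrix multiply illustrating the accumulator data flow:
    each output element is a chain of multiply-adds whose running sum
    never leaves the local accumulator (the pattern a matrix unit
    hard-wires in silicon, avoiding register-file round trips)."""
    n, k, m = len(a), len(b), len(b[0])
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            acc = 0.0                        # local accumulator
            for p in range(k):
                acc += a[i][p] * b[p][j]     # result feeds the next MAC directly
            out[i][j] = acc
    return out

print(mac_matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19.0, 22.0], [43.0, 50.0]]
```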
automatic6131 · 2 years ago
>vector float operations and little else

I thought they were generally int8 or int16 vector multiply-adds, with occasionally float16 added in.
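For what it's worth, a sketch of that int8 pattern, assuming the common arrangement in NN inference units: 8-bit operands with the running sum widened into a larger integer accumulator.

```python
def int8_dot(xs, ys):
    # Dot product with int8 operands; hardware typically widens the
    # running sum into an int32 accumulator so the sum can't overflow.
    acc = 0
    for x, y in zip(xs, ys):
        assert -128 <= x <= 127 and -128 <= y <= 127, "operands must fit int8"
        acc += x * y          # each product fits in 16 bits; sum is widened
    return acc

print(int8_dot([127, -128, 5], [127, 127, 2]))  # 16129 - 16256 + 10 = -117
```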

exikyut · 2 years ago
As someone with a lot of interest in, but no fluency with, chip design (or the dividing and conquering of math within silicon, for that matter), how would you multiply a million-by-million matrix?
bigbillheck · 2 years ago
> naive million by million matrix multipication....how do we reduce this to 1s

A matrix of that size in single precision is 4 TB; a better question is how do you store it?
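The storage arithmetic, for what it's worth (single precision, 4 bytes per element):

```python
# One million x million matrix in single precision (float32).
n = 1_000_000
bytes_per_matrix = n * n * 4           # 1e12 elements x 4 bytes
print(bytes_per_matrix / 1e12, "TB")   # 4.0 TB; x3 for A, B and the product
```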

tyingq · 2 years ago
This article seems fairly straightforward: https://maelfabien.github.io/bigdata/ColabTPU/
atorodius · 2 years ago
Chips take a lot of time from design to roll out. Giant LLMs being something you want to serve is a very recent development and it will take time for the deployed hardware to catch up, no surprises here.
elAhmo · 2 years ago
Other than bringing up Google Assistant with Bard, the article doesn't mention giant LLMs at all. It lists features such as Gboard proofreading that are not done on device, as well as the chips heating up when doing general tasks such as downloading, so in that regard it seems that most AI tasks are not done on the device. Which was the promise sold with these chips.
moffkalast · 2 years ago
The NCS2 looks really pathetic now with its 512 megs of RAM. Just a few years ago, nobody seemed to expect what the demands would be.
blackoil · 2 years ago
Does the article have anything to say apart from clickbait? From the article, Google is working on a 3nm chip with TSMC, so it is surely not abandoned. The current chips sometimes run hot; that's bad, but not an existential crisis. Not all AI things can be on-device; was that ever a promise? Is it even feasible? Is any competitor doing better? LLMs with hundreds of billions of parameters can't be run on a beefy desktop, so why is it a surprise or concern that Bard can't run on a Pixel?
Roark66 · 2 years ago
I came here to say exactly the same. Where is any substance in this article? What are the capabilities of the chips? How many TFLOPS can they do?

I'm not buying it. I have a couple of Google's (mini) Edge TPUs doing object recognition in my CCTV server. They have been working 24/7 for about 2 years and are consistently delivering on the 4W/4TFLOPS promise (although for some reason they stopped advertising this number, focusing on benchmarks instead).

I for one wish Google would give us their big (or even medium) sized TPUs to play with on an add-on card rather than in their phone.

brucethemoose2 · 2 years ago
> I for one wish Google would give us their big (or even medium) sized TPUs to play with on an add-on card rather than in their phone.

Yeah, no kidding. I know they want to push GCS, but what a boon that would be for TPU support in ML projects.

dacryn · 2 years ago
Headlines about Google products being end-of-life, and the perception of them being killed off, generate a lot of clicks, and it is so tiring.
gorbypark · 2 years ago
While intentionally a bit click-baity, I think the author is saying that features _using_ the ML tensor cores are on life support. They announced/hinted that a bunch of features would use on-device ML, and in the end they are just using the cloud like they always have.
danieldk · 2 years ago
As far as I understand, some inference is done on-device. LLMs and diffusion just changed the field in the last year and it takes time for hardware to catch up (+ work to reduce model sizes). So, it's just hard to run the latest models on-device. So you either end up doing it online (Google's preference) or having weaker models (Apple's preference).

fooblaster · 2 years ago
The "on life support" title has absolutely no justification and is just a sad editorialization with no evidence from insiders at google. They could have just as easily said "tensor chip isn't so good with tensors" and gone with a more factual headline.
dade_ · 2 years ago
The image Google has created in the market for its products is: stellar success or death. The article states and provides examples where the chips are completely missing the mark (that Apple is setting) and that the chips aren’t a stellar success, so "life support" is probably a group of people trying to keep the project alive.

dangus · 2 years ago
“Tensor chip isn’t so good with tensors”

“My car isn’t so good at driving.”

Kind of sounds like life support.

i5-2520M · 2 years ago
Galaxy phone not good for space travel.

Snapdragon can't breathe fire.

crest · 2 years ago
As if that had ever stopped one of the big established car companies from building more bad cars.
phkahler · 2 years ago
Maybe if I had a better understanding of the Pixel's capabilities I might have bought one.

Maybe Google can't market anything because they're so entrenched in the least effective advertising methods out there?

Where are the catchy short commercials on YouTube compelling me to this great tech? Just because everyone knows Google doesn't mean they know what Google has to offer.

jmcgough · 2 years ago
Google is absolutely awful at marketing. Strada was really impressive tech that let me play AAA games almost seamlessly on my old Thinkpad, and none of my friends knew what Strada was because of how it was marketed.
JOnAgain · 2 years ago
Do you mean Stadia? Is it ironic that you got the name wrong?

It wasn’t good enough. It purported to be great, but it had random connection issues. Given that Destiny, a first-person shooter, was their flagship game, these flaws really jumped out.

You also really needed good internet. Like really good. Mine performed well sometimes when it was hardwired, but lagged regularly over WiFi.

I think these things were fixable, but that thing was dead within a month of launch. Google didn’t get fully behind it, so all the people that launched it got their promos, saw the writing on the wall, and transferred out.

theshackleford · 2 years ago
Everyone in my circle knew what Stadia was, and knowing it was Google was enough for them to avoid it. A choice that was ultimately the correct one.
justinsaccount · 2 years ago
They might have heard of Stadia.
danieldk · 2 years ago
> Maybe if I had a better understanding of the Pixel's capabilities I might have bought one.

I am not sure if you'd want one? I returned the Pixel 7a after a few days. Stock Android is great, the camera is top-notch and goes head-to-head with iPhone. But the fingerprint sensor is mediocre (they use an optical sensor) and charging is very slow. The issues make it not-great as a daily phone. People have reported similar issues with the 7.

digikazi · 2 years ago
That was my experience as well. Generally speaking, the Pixel 7a is a competent phone with one (big) flaw: the battery life is truly terrible, and the charging speeds are glacial. I was honestly expecting more, especially as it's advertised as Google's best. The forums are full of people complaining about it, and the responses are almost all the same: turn off 5G! Turn off wifi scanning! Turn off bluetooth scanning too! Dial down the screen refresh rate!

Returned the phone and got an iPhone 13 instead; and as painful as it was switching to iOS after 13 years of Android, I don't regret my decision. The iDevice comes with its own niggles, but at least I don't have to worry about battery life (yet). Maybe Google will figure it out by the time the Pixel 8a comes out.

phkahler · 2 years ago
You may be right. If I had known the flaws I might not have bought it. But I still need to know the upside to even consider it, and that falls on Google, not me.
dna_polymerase · 2 years ago
In Germany, they ran an ad campaign for those devices. They gave some well-known influencers some budget to make a spot by themselves and ran that on YouTube. Those spots were the bottom of the barrel. No idea who runs their advertising, but they need to fire all of them.
moffkalast · 2 years ago
Ironic, given that they run the largest ad network on the planet. They could advertise others, but not themselves.
borissk · 2 years ago
My experience using Pixel 7 is it has the smoothest UI of any Android phone I've tried - including flagships like S23 and Xiaomi 13T.

Also, the author assumes Google wants to do on-device AI processing, but given Google's desire for user data, they may be uploading pictures and videos to the cloud just for the sake of having the data (and maybe using it to train, say, self-driving AI models).

Al-Khwarizmi · 2 years ago
My experience with my Pixel 6 Pro is also an amazingly snappy UI and great software, but absolutely crap battery, the worst of any smartphone I've ever had (and I suppose that's the chip's fault, because the battery itself has a normal mAh capacity). The modem is also quite unreliable, to the point that I sometimes disable 5G and 4G because they do more harm than good.

I could live with the modem issue (3.5G is OK for 99% of my smartphone usage anyway), but having to care so much about battery running off is a quite important drawback for daily life. The fact that they seem to be keeping the same path with the chip makes me think that I'll have to look for another brand for my next phone.

KJBweb · 2 years ago
Got the 6 Pro too; my battery life isn't great either, but I WFH, so I just mitigate with a power bank or spot charge for the times when I am on the go all day.

Never had issues with reliability though, I've been really impressed with the hardware aside from the battery.

dd_xplore · 2 years ago
I haven't used 3G/3.5G in over a decade now (almost). Even 90% of our carriers have disabled 3G services; some are 4G/5G only and others support 2G/4G/5G.