JadoJodo · 4 years ago
> is it a feature focused on search and retrieval, or an assistant that carries out complex tasks?

I'd settle for an assistant that carries out simple tasks.

I don't expect much from my robot assistant, but the number of times throughout the day that she gets it blatantly wrong has become a running joke in our household.

A few recent examples:

    Me, holding my Apple Watch near my face: "Hey, Siri, drive to Kroger <there's one I go to all the time that is 10 minutes from my house>"

    Watch: "Now playing <some band I've never heard of, that isn't in my library>"

    Simultaneously, iPhone: "Getting directions to Kroger <shows results that are ~700 miles away>"
Or:

    Me, to my iPhone: "Hey, Siri, start a yoga workout."

    iPhone: <starts playing a Yoga video on Fitness+, instead of starting the plain Yoga tracking workout I do every day>
Edit:

I just remembered another one. I was mowing the lawn on Sunday and wanted to start an outdoor walk. The mower was off, and I had my AirPods in:

    Me: "Siri, start an outdoor walk"

    iPhone: "Ok, I've cancelled your alarm for tomorrow"

LgWoodenBadger · 4 years ago
I had this experience over the weekend:

Using Siri on Apple Watch to start a timer:

    "Hey Siri, set a timer for 3 minutes"

    "Sorry, you don't have the Timer app installed"

jonathankoren · 4 years ago
This gets to a problem I’ve had with Siri once it rolled out to multiple devices. Each device has a different set of capabilities. Sometimes this makes sense (e.g. I don’t expect my TV to be able to place a call; it doesn’t have a microphone, nor a camera), but many times it’s just bizarre. For instance, I can’t set an alarm on my laptop. It has a clock, yet I can’t do clock things.

It seems like some basic features (e.g. clock-like features) should just be built in, and where that’s not feasible, perhaps there should be some sort of device handoff, which essentially already happens on the watch.

retSava · 4 years ago
When I'm feeling cynical, I think that bad search and bad implementations (to me) are really good implementations; the difference is that the goal is not my goal.

Examples:

* When starting to type, the Chrome address bar prefers to go to a Google search instead of the friggin site I've gone to a hundred times already!

* Google Maps search seemingly doesn't deterministically remember my searches. If I start typing the same search term I did yesterday, well, odds are I'm looking it up again! Show that.

* The Windows start menu... oh, don't get me started on that. :)

Hamuko · 4 years ago
I'm having trouble getting Siri to even understand me. I have a HomeKit scene called "Apartment lights off" (it turns off all the lights in the apartment, surprisingly enough) and I often ask Siri to apply that scene. The number of times Siri completely misunderstands me as something strange baffles me.

"Upon lights off – Sorry, I don't understand."

"Obama lights off – Sorry, still don't get it."

Sometimes it just picks up the "lights off" part, prompting me with "Which room?" And when I say "Apartment lights off" again, Siri will then reply with "Sorry, I can't find that scene in the current home." At that point, I need to close the Siri prompt and do it again. If Siri understands me that time, then I will actually get the scene applied.

mlang23 · 4 years ago
I suspect the problems you have are due to the name of the scene. It appears certain keywords overrule locally-defined names. Whenever I put an address book entry for a taxi company with the name "taxi" in it, Siri isn't able to call it because it assumes I want to search for a taxi nearby. A similar thing might be going on with "apartment" and HomeKit. Frankly, a bit cynically, by now I suspect some intern put in a rule that matches "apartment" and has a very high priority, or somesuch...

Not directly related to Siri, but VoiceOver icon labels are also getting worse and worse over the years. For example, the weather app. In the beginning, when you scrolled with the VoiceOver cursor through the hourly forecast, it would say "Sonnenaufgang" and "Sonnenuntergang" (Sunrise and Sunset) at the appropriate places. Starting with iOS 12 (or was it 13?) the spoken label changed to "Sonne aufwärtspfeil" (sun arrow up). So someone went in there, decided that the nice description from earlier wasn't good enough, and replaced it with a literal description of the icon, ignoring context altogether.

kevinmgranger · 4 years ago
"All lights to max"

"Which phone number for Max?"

I wish I could train myself to say a different phrase.

seanmcdirmid · 4 years ago
Try using groups instead of scenes. Also, key by room, or even use blanket universals (rather than, say, "apartment").

“Turn on all living room lights” works. Also, “turn off all lights” will flip off everything in our house.

Groups are awesome. I have 3 blinds in the north of the living room, so I called them north west, north center, and north east. But they are in the north group, so I usually put them all up at once by saying “open north blinds.”
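For the curious, groups are also exposed programmatically via the HomeKit framework. A minimal sketch, assuming the app already has HomeKit permission; the group name, function name, and accessories here are hypothetical, just to illustrate service groups:

    import HomeKit

    // Sketch: create a "North Blinds" service group and add three blind
    // services to it, mirroring the naming described above.
    func makeNorthBlindsGroup(in home: HMHome, blinds: [HMService]) {
        home.addServiceGroup(withName: "North Blinds") { group, error in
            guard let group = group, error == nil else { return }
            for blind in blinds {
                group.addService(blind) { error in
                    if let error = error {
                        print("Could not add service: \(error)")
                    }
                }
            }
        }
    }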

adolph · 4 years ago
How has Doonesbury not put Siri out of her misery?

I started using this garbage with the 4S, and while there have been under-the-hood improvements in onboard processing, etc., the core product has never gained the implicit context awareness that was promised and needed. As far as I can tell, it's a checkbox zombie "feature" that never met the usability threshold to become something for normal people, but has organizationally aged into slow-development maturity.

https://www.computerhistory.org/revolution/mobile-computing/...

amluto · 4 years ago
I’ll bite.

Me to watch: Hey Siri, call so-and-so on Bluetooth

Siri: calls so-and-so with Bluetooth selected, but nonetheless the audio comes out of the watch until I touch the phone.

Siri on Apple Watch is generally very weak.

vrc · 4 years ago
Recently I went through the trouble to create lists to use Siri to add shopping items. I say, “add mustard oil”. Siri then adds “mustard” and “oil”. No problem, I’m smart. So I create a custom word in my phone and train it on my pronunciation. Try again, “mustard” and “oil”.

Guess I’m going to be shopping for “oil of mustard” then

nanidin · 4 years ago
I have also found Siri’s voice recognition for shopping list items lackluster, especially compared to Alexa. I’m guessing it has to do with Amazon having a better model of actual subjects/products due to having a storefront with said items.

It’s really bad when I’m at the store looking at something in my list scratching my head over what “Silicon Matt to sit rock song” is supposed to be without any way to hear what was actually said or give feedback that what Siri did was wrong.

FabHK · 4 years ago
"Hey Siri, is my phone still charging?" - "The current volume is 43%".
moffkalast · 4 years ago
Siri can't start an outdoor walk, silly. She doesn't have legs.
rasz · 4 years ago
Are you Scottish by any chance? https://www.youtube.com/watch?v=NMS2VnDveP8
JadoJodo · 4 years ago
I am not :/
ubermonkey · 4 years ago
Having multiple things actively listening for "Hey Siri" seems like a problem you've created, honestly.

I have my phone set to only engage Siri with the long button press. It saves a lot of trouble.

robmccoll · 4 years ago
Blaming the user for the lack of simple coordination between devices when they buy into your ecosystem across multiple devices and attempt to use the advertised features doesn't seem like a great product strategy.
Barrin92 · 4 years ago
The entire point of these assistant technologies is supposed to be that they seamlessly integrate with the environment. If you have to do long button presses or avoid this or that, you can just open Spotify yourself and click play. Given that the benefit you get out of this stuff is likely marginal to begin with, failing in 5% of cases is already way too much.

At this point it seems to me like most of it is basically toy technology without any real benefit unless you're visually impaired or unable to use your hands or something. People went from voice messaging to texting with other people because tactile inputs are often faster, so it's always puzzled me what the point is of talking to machines that can't even understand things properly half the time.

mikestew · 4 years ago
Multiple devices are supposed to coordinate amongst themselves to decide who takes the request:

https://support.apple.com/en-us/HT208472

jordansmithnz · 4 years ago
As a HomeKit user, I was looking forward to iOS 15, thinking there might be some incremental improvements to Siri’s handling of HomeKit commands. It wasn’t bad before, but it had room for improvement.

Unfortunately iOS 15 significantly deteriorated Siri’s understanding of HomeKit in some situations. Here’s a few reproducible examples:

‘Hey Siri, shades down’: sets the shades to 99%, i.e. a 1% adjustment. I now need to say ‘closed’ rather than ‘down’ to close them.

‘Hey Siri, close the living room shades’: responds with the current shade status. I need to switch the ordering to get things working (Hey Siri, living room shades: closed)

Did someone delete a mapping file of common HomeKit voice commands to actions or something? At this point, I’d almost prefer no AI, and just a static command list. I guess I could create that with shortcuts, but with 20 or so accessories that’s a big chore to set up.

nerdjon · 4 years ago
This has been driving me insane since iOS 15.

Heavily invested in HomeKit (to the point of having several HomePods around the house), and it feels like something changed, though much of it I can't quite put my finger on.

But a few that I have been able to identify:

- Something has changed about how it handles figuring out which device you are talking to. I have had a HomePod in a different room go off (screwing up the room awareness), or my iPhone going off even when I am right next to a HomePod.

- All of my HomePod stereo pairs (two linked together) switched which HomePod speaks to the opposite one, and there is no clear way to change this.

- Every action has slowed down. I can't figure out why, but most of the time I get "one moment" while it tries to turn on lights or whatever.

- They added the ability for the HomePod to turn on and off the Apple TV. Which is great in theory but I have my own custom TV configuration hooked up to Logitech harmony and it will constantly ask me "do you want to control x or y" (one being my custom and one being the Apple TV depending on the room I am in). I want to disable this but I can't.

Edit: All I really want is a log. Please. A log of automated events that happen (I still don't know why my lights will randomly turn off... something is triggering it and I can't figure out what) and a log of what it thinks it heard and tried to do so I can address the "nothing with that name or function is found" or whatever it says when I say "Turn off the fan" and it doesn't know what I meant, and then I try again and it works.

mattsieren · 4 years ago
Glad to see I'm not the only one. I feel like a bunch of previously working commands have gone missing or are interpreted entirely wrong. Siri doesn't respond properly to "Turn the radiator in living room off", but when you say "turn heating in living room off" it even responds with "turning radiator in living room off". As a consumer, this is downright frustrating.
Tempest1981 · 4 years ago
Could this be why?

https://www.engadget.com/ios-15-siri-on-device-app-privacy-1...

"But with iOS 15, many common Siri requests will be processed on-device, so no audio will ever leave your phone."

asiachick · 4 years ago
HomeKit sucks in general. Even without Siri, the UI is unresponsive, broken, and missing obvious features. The UI needs to show at least 3 states (maybe more) but only shows 2, on and off. Often I'll click a light and get ZERO response from the UI. IIUC it's waiting for the light to respond with its state, but a UI that gives no immediate feedback is useless. It's infuriating that Apple, who wrote the interface guidelines about instant feedback, didn't apply them to their own software.
stevedekorte · 4 years ago
“Hey Siri, close the living room shades” works for me. The problem I have is that we have both sun shades and black out blinds and Siri treats the words “shades” and “blinds” as the same word, forcing me to create special group labels for each so it can distinguish them.

That said, it does feel like magic to be able to easily control all the lights and blinds with voice commands and Siri usually gets it right.

drewg123 · 4 years ago
Coming from Android, I used to use the Google Assistant in hands-free situations, like when cooking or driving. I've found that Siri is TERRIBLE in these situations, because it's more likely to punt and give you some links to click on and say "I found this on the web".

I just asked Google and Siri "what temperature do I need to cook fish to". Google came right back with 145F. Siri put up some stuff she found on the web, and didn't give a verbal answer.

I like almost everything better about the iPhone, but I really wish I could replace Siri with Google Assistant.

Isthatablackgsd · 4 years ago
The only time I find Siri useful is for disabling Bluetooth on my MBA. I have a setting where, whenever my BT mouse is connected, the trackpad is automatically disabled. Sometimes I forget to bring the mouse with me when I take the MBA outside or to another room. I ask Siri to disable Bluetooth so I can have the trackpad functionality back.
manwe150 · 4 years ago
145F is beyond well-done for fish

Edit: I must restrain myself from changing this comment into a pun about it seeming fishy.

m463 · 4 years ago
Related: I read recently that the recommended internal temperatures for some meats were lowered from 165.

EDIT: ok, here's the chart:

https://www.fsis.usda.gov/food-safety/safe-food-handling-and...

Some things are still 165, but some are now 145. Ham in some cases is 140.

whywhywhywhy · 4 years ago
I'm convinced they've swapped out the microphones used by Siri for cheaper and cheaper ones over the years, because it no longer seems to be able to pick up your requests from across the room, which it could do reliably at the launch of Hey Siri.

Presumably because the whole AI assistant thing kinda turned out to be a fad and not as world changing as people were thinking around the time Siri/Alexa launched.

Turns out the thing it's good for is setting timers, not ordering soup like Apple originally envisioned. https://www.youtube.com/watch?v=EP1YAatv1Mc

mettamage · 4 years ago
Edit: forgive my enthusiasm. Voice assistants seen as command-lines make me excited.

No it's not! The issue is that we as developers can't actually innovate on it. Look at what Apple did in its heyday with the App Store. Now it's commonplace, back then it was mindblowingly innovative unicorn goodness (I'm a bit biased :) ). And I understand, you can critique the App Store for many things, but the iPhone would be worse without it.

And they didn't create an "App Store" or any other free extensible model for Siri. They should have, because people are on aggregate more creative than any big company including Apple -- thanks to survivorship bias.

What Siri is actually quite good at is interpreting single words or small sentences. So it would be a perfect spoken command-line interface. I have had genuine situations where I needed to talk and couldn't type, but apps aren't allowed to make use of Siri, so I can't talk to apps, I need to type.

Siri and other voice assistants are amazing when you look at it as we look at a command-line.

jon-wood · 4 years ago
> And they didn't create an "App Store" or any other free extensible model for Siri. They should have, because people are on aggregate more creative than any big company including Apple -- thanks to survivorship bias.

They did, it's just not well advertised. Any app on iOS can expose actions to Siri; for example, the BBC Sounds app exposes actions to start playing a specific radio station. Those actions are then available from Siri on your phone, and from any HomePods that have been granted access to execute actions on your phone.

On your point of looking at it like a command line, those same actions are also exposed to Shortcuts, allowing you to compose them in arbitrary pipelines. The output of one action can be passed into the input of another, or even stuffed into a variable for later use.

I've developed for all three of the major voice assistants and Siri is by far the best simply because it completely throws out the window the concept of developing a "voice application", voice is just one possible entry point into an iOS app.
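To make that concrete, here is a minimal sketch of an app-exposed action using Apple's App Intents framework (iOS 16 and later; at the time of this thread the equivalent mechanism was a SiriKit custom intent). The intent, parameter, and phrase below are hypothetical, not the actual BBC Sounds implementation:

    import AppIntents

    // Hypothetical intent; the names and the playback call are illustrative only.
    struct PlayStationIntent: AppIntent {
        static var title: LocalizedStringResource = "Play Radio Station"

        @Parameter(title: "Station")
        var station: String

        func perform() async throws -> some IntentResult {
            // The app's own playback code would run here, e.g.
            // AudioPlayer.shared.play(station: station)
            return .result()
        }
    }

    // Registering a phrase makes the action available to Siri and Shortcuts
    // without the user having to set anything up.
    struct RadioShortcuts: AppShortcutsProvider {
        static var appShortcuts: [AppShortcut] {
            AppShortcut(
                intent: PlayStationIntent(),
                phrases: ["Play a station in \(.applicationName)"]
            )
        }
    }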

veidr · 4 years ago
Interesting take. I am kinda excited about voice assistants and the command-line style use cases for them, too, except the closed nature of Siri and the dominance of Apple have probably set them back 10 or 20 years.

Yes, if Siri were open and any iOS app could easily make a "voice CLI" for their app, that would be awesome. But that's not the case at all.

I am a heavy user of Siri, too, with an Apple Watch and a HomePod in every room of my house. It's better than nothing. But it is still objectively terrible. It can barely do more than set timers, send text messages, and control the lights and music.

That's not useless, but it's not even close to what I wish I could do with my voice (which is basically what you describe).

redacted · 4 years ago
Ironically enough, given its reputation, this is what Bixby is great at - controlling the _device_ with your voice using short commands. "Open Camera", "Take photo", "Delete last photo" etc. all work as expected (and those are toy examples; far more complicated stuff is possible even before including their Siri Shortcuts equivalent).
cpuguy83 · 4 years ago
Siri sucks because it doesn't understand what you are saying at all, not because there aren't a million things hooked into it.

To be fair, all the assistants are extremely bad at it, Siri just seems to be at the bottom of a bad list.

mlang23 · 4 years ago
Well, looking at the skills offered by Alexa, I am not sure a Siri app store would be worthwhile. 99.9% of Alexa skills are totally and utterly useless.
Anonymous4272 · 4 years ago
Can you not use voice typing with SwiftKey (or any other keyboard) and open Termux? I've done that a few times on Android.
pessimizer · 4 years ago
I've loved the idea of a verbal commandline, too, especially since watching the Play For Tomorrow episode "Shades[*]." It's a teen brave new worldish sort of thing, but all of the characters are constantly interacting with databases, networks and media (including VR) with this very stylized language, not how you would talk to a person.

That's the problem, really. You have to use a stylized language on a commandline. A GUI is really just giving you a graphical diagram and guide to a really stylized language (if you reduce a GUI to its clickable/typeable areas.) It's very hard to draw an arrow pointing at a picture representing a particular predicate in the very restricted bandwidth of serialized sound.

The reason why "Shades" was so interesting to me was that its future teens clearly all knew the stylized language well, and obviously would have learned it in school. To have that kind of universal training in a certain form of computer interaction would require a universal OS, like what UNIX quickly became.

So maybe this is a conclusion coming off the above at an odd angle, but what I'm actually saying is that commandlines are hard and people don't like them. Visual, and even tactile stuff have the ability to offer so much feedback and so many affordances for amateur operators in comparison to audio that audio can't compete. Audio feedback is all exposition and nothing to hold onto or see. And just like film exposition, it often comes too slowly to keep people's attention, or too quickly for people to understand. It's always going to be easier to grab the knob and turn it, or to find the button and click it.

The major advantage of audio interfaces is that they're cheap both materially (you don't need a screen or a control panel) and for the user (you don't have to stop walking, or sit down, or grab a thing, etc. you just speak.) For me these advantages are actually dominant over innate usability, which is why I also believe with you in the huge potential of audio interfaces as a platform.

I just think that the only way to get there is to build a universal and open verbal protocol, with standardized interfaces for accessing databases, media, and networks. Something that children can (will) learn, and that will have a lifetime of use. An opinionated and efficient design moonshot like the web (or the Bourne Shell or GNU Coreutils) turned out to be. A platform whose innovation comes from the addition of new ways and not from the subtraction of the old ones. A platform that holds onto backwards compatibility as long as is practical, thinking in terms of how many decades to keep things around, not how many releases -- or possibly even a closed kernel that will never change, built on top of some mathy orthogonal basis system.

* I wish this hadn't been deleted from youtube.
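As a toy illustration of the kind of stylized, deterministic grammar described above (everything here is hypothetical): a fixed "VERB TARGET [to VALUE]" word order that a trained speaker could use and a machine could parse without any guessing:

    // A fixed "VERB TARGET [to VALUE]" word order: stylized, but unambiguous.
    enum Verb: String { case open, close, dim, query }

    struct Command {
        let verb: Verb
        let target: String
        let value: String?
    }

    func parse(_ utterance: String) -> Command? {
        let words = utterance.lowercased().split(separator: " ").map(String.init)
        guard words.count >= 2, let verb = Verb(rawValue: words[0]) else { return nil }
        if let toIndex = words.firstIndex(of: "to"), toIndex > 1, toIndex < words.count - 1 {
            return Command(verb: verb,
                           target: words[1..<toIndex].joined(separator: " "),
                           value: words[(toIndex + 1)...].joined(separator: " "))
        }
        return Command(verb: verb, target: words[1...].joined(separator: " "), value: nil)
    }

    // parse("dim living room shades to 50")
    //   -> Command(verb: .dim, target: "living room shades", value: "50")
    // parse("close north blinds")
    //   -> Command(verb: .close, target: "north blinds", value: nil)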

BitwiseFool · 4 years ago
I'm averse to any kind of voice assistant. I'm the kind of person who just finds it faster to type in my query or go hunting for the information I want directly. I also come from a time when voice recognition used to be just barely functional, so I'm biased to think the phone will mishear me. I also have a bias towards thinking that computers aren't meant for natural language questions - I'm still very much a keyword query maker. To this day, I have never asked Siri anything, nor Alexa, Cortana, or whatever Google is calling theirs right now.

I know I can't be the only one, right?

thefounder · 4 years ago
I find Siri great... while driving (to call, play music, change volume and ask for news). And that's the only use I find for it.
ghaff · 4 years ago
No. Even asking the weather, I can get a weather report with far more context in about 30 seconds on weather underground either on my phone or on a computer.

And for places where I can't just go to a computer or phone, e.g. my car, I find Siri frustratingly difficult to instruct to perform specific tasks, especially without taking my eyes off the road or hands off the wheel. Even setting a GPS destination by voice alone is hit or miss much less finding a podcast that hasn't been pre-loaded.

Scarblac · 4 years ago
I stutter, and as a result just don't like talking. I'll never use an interface like this.
m463 · 4 years ago
I wonder if there were an interface that required singing, or one that would play classical music while you talk - would that work for you?

I say this because of two things I've heard (possibly misconceptions):

One was that people who stutter can sing OK.

The other was the movie "The King's Speech", where playing classical music while talking would prevent the king from stuttering.

lostgame · 4 years ago
I was like this up until I got my first Apple Watch.

Maybe because it’s that I’m white and speak english, but it’s dictation is unbelievable and I can even send Facebook messages or texts an average 9/10 times without needing to modify it.

I’ve never - for instance - had an issue where I’ve said ‘set a timer for 5min’ and it’s never worked.

I think everyone’s mileage varies.

_fat_santa · 4 years ago
For me it's that my brain just never got used to asking the assistant to do things. Sure I asked it the weather or for the news when I first got it. But after a while I noticed that if I wanted to know these things I would just pull it up on my computer or phone without even thinking to ask the assistant. I'm sure if I used it longer I would get used to it.
erulabs · 4 years ago
Outside of cooking, I'm with you. I use Siri for setting timers while I have my hands full in the kitchen - and even then - it only allows _one timer at a time_. My phone has multiple processor cores, but it requires the assist of my 15 year old microwave to track both rice -and- veggies...
withinboredom · 4 years ago
I feel the pain with setting an alarm... I have my phone set to 24-hour time. When I say 16:04, it's distinctly different from 04:04. Yet when I tell Siri to set an alarm for 09:40, it sets one for 21:40 like an idiot.
wartijn_ · 4 years ago
In some countries that use 24-hour time notation, it's really uncommon to use it in spoken language[0]. When I say four past four in Dutch, it could be 04:04 or 16:04. So even if you've set the clock to 24 hours, that doesn't mean you would actually say sixteen when referring to 16:xx.

You can set the region in the settings, maybe that influences the way Siri interprets time? (I would be surprised if it does, but you never know)

[0]https://en.wikipedia.org/wiki/Date_and_time_representation_b...

smoe · 4 years ago
Would you never use the 24h format when speaking? E.g. I'm Swiss and I would often say 16:04 when referring to a precise time like a train departure, but otherwise four, quarter past four, half five, and quarter to five.
nonameiguess · 4 years ago
I don't have a ton of experience using Siri, but it seems like it might have issues with time in general. I got an email just yesterday with an invite to a Zoom meeting attached. I was about to just import the ics file into my Apple Calendar, but then I noticed the email had a line saying Siri had detected an event and could add it automatically. It did, but the listing was 5 hours early, and it scared my wife when she saw it because she thought I had scheduled something for the middle of the day while she was at work.

I have a lot of trouble understanding how plain text in the email saying 6:00 PM Central Time could have been interpreted as anything other than 6 PM Central Time.

It was also annoying that you apparently can't directly edit a calendar entry's time. I had to go to DuckDuckGo to search for how to change the time, and the answer is that you need to click and drag the box containing the event. There is no UI to just type in a time. I guess they did this to be able to deploy an identical app on macOS and iOS? Neutering desktop applications so they work better on mobile is an increasingly annoying trend.

dgellow · 4 years ago
I face the same annoying issues with Siri when I’m trying to set reminders from the Apple Watch. I often have to repeat 2-3 times then remove the incorrect ones.

But it's even worse, IMHO, in other languages! When speaking German, the units are spoken before the tens (fünf-und-dreizig to say 35). That completely confuses Siri when saying dates, hours/minutes, or just big numbers.

nextstep · 4 years ago
Dreissig*

I speak German to Siri everyday and she never has this issue with setting alarms. Maybe it’s a dialectical difference?

1cvmask · 4 years ago
A minute off from the 09:41 of Apple ads and screenshots:

https://www.businessinsider.com/heres-why-the-time-is-always...

mkr-hn · 4 years ago
Workaround: say hours at the end. I just said "hey Siri, set a reminder for oh nine forty hours" and it handled the time perfectly by setting it to 9:40 AM.
adolph · 4 years ago
You’re holding it wrong — How to blame your users

https://uxdesign.cc/youre-holding-it-wrong-how-to-blame-the-...

suction · 4 years ago
I like Apple in general, but using Siri is the most frustrating experience Apple has ever rolled out to the public. It's absolutely awful and useless in its current state for anything but setting a timer or alarm.

Google Assistant is several galaxies above it in every aspect.

ubermonkey · 4 years ago
Wow, opinions vary here a LOT.

I find Siri very capable and useful. It's surprising to me when I don't get what I want out of it.

dc-programmer · 4 years ago
Kind of related, but has anyone else noticed iOS text autocorrect has gotten notably worse over the years?
deergomoo · 4 years ago
I believe at some point in the last couple of years (iOS 13 maybe?) they switched to a machine learning-based system and yes, it's been significantly worse since.

I cannot put into words just how frustrating it is to have the device change a correctly spelled, correctly chosen word because it thinks I may have meant something else. Double points if I backspace to change it back and it re-"corrects" me again.

throwaddzuzxd · 4 years ago
> Double points if I backspace to change it back and it re-"corrects" me again.

That's just awful design. Another quirk is that it can autocorrect the last word you typed when you press "send". Sometimes I proofread myself and still end up bitten by the autocorrect because it changed something when I pressed "send".

On Android when you backspace once after an autocorrect it reverts the autocorrect. It's not great but still better than what iOS does.

solarkraft · 4 years ago
Love it when it replaces a correctly chosen word while swipe-typing the next few ones because it would fit better in the new context.

Boggles my mind how you can let something like that into prod. The things it leads to are

- Errors that make the message hard to understand (here it replaced “that” with “they” and “here” with “get”), which are only caught while proof-reading or not at all

- Having to go back and re-type 3 words for correction.

I check the word I just swiped for misrecognition, not one that was previously correctly recognized, god damn it.

The actual success rate of that piece of garbage is maybe 5%.

That said: Of course I’m happy they have a swipe feature at all, albeit significantly worse than Google’s (about on par with Microsoft’s). But you don’t have to make the feature intentionally worse, do you?

ashtonkem · 4 years ago
The re-correction thing is unacceptable, imho. No predictive text algorithm will be perfect, but why in holy hell they would let it ignore an explicit correction is beyond me.
rusk · 4 years ago
It just refuses to learn!

This is inarguably worse than what we had with T9 like 20 years ago, all running without cloud support on a limited embedded platform.

rusk · 4 years ago
Yes https://discussions.apple.com/thread/252621788?answerId=2554...

If you’re wondering why it says “Access is denied”, that should inform you about how Apple deals with negative customer feedback these days.

webmobdev · 4 years ago
I always turn off this feature on both Android and iDevices because it can act like a keylogger - the Terms of Service for it allow both Google and Apple to "collect data to improve their services".