I implemented the same behavior in a different Google product.
I remember the PM working on this feature showing us their research on how iPhones rendered bars across different versions.
They had different spectrum ranges, one for each of maybe the last 3 iPhone versions at the time. And overlaid were lines that indicated the "breakpoints" where iPhones would show more bars.
And you could clearly see that with every release, iPhones were shifting all the breakpoints further and further to the left, rendering more bars at lower signal strength.
We tried to implement something that matched the most recent iPhone version.
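The breakpoint scheme described above can be sketched as a simple threshold table. The dBm values here are invented for illustration, not Apple's or Google's actual thresholds; the point is just that shifting the breakpoints "to the left" (toward weaker signals) yields more bars for the same measured signal:

```python
def bars(rsrp_dbm, breakpoints):
    """Count how many breakpoints the signal meets (0..len(breakpoints) bars)."""
    return sum(rsrp_dbm >= bp for bp in breakpoints)

# Invented thresholds for two successive "releases"; each bar needs a weaker
# signal in the newer mapping.
old_breakpoints = [-110, -100, -90, -80]   # dBm needed for 1..4 bars
new_breakpoints = [-115, -106, -98, -90]   # shifted left: more generous

rsrp = -105  # a fairly weak signal
print(bars(rsrp, old_breakpoints))  # 1 bar under the old mapping
print(bars(rsrp, new_breakpoints))  # 2 bars under the new mapping
```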
To be sure, is it possible that, on each subsequent iPhone release, the hardware got better at handling weak signals, and thus a mediocre signal for iPhone N was decent for iPhone N+2 and would give great throughput on iPhone N+4?
Possible, sure, but wouldn't it be better marketing for the iPhone to have better performance at fewer bars? Phones are judged on their performance, but network providers on the number of bars they show on the screen.
The comment you’re replying to is incredibly concerning. Is he saying people at Google are purposefully misrepresenting signal strength so they can “compete” with Apple?
Bars really don’t matter. You can have full bars and slow to no internet. You can have one bar but relatively decent internet. Honestly kind of wish the signal display would go away and instead show me when I lose internet.
That is literally what I am observing lately with my provider: I have 2 bars and yet no internet, whereas my gf, using the same iPhone model with a different provider and also showing 2 bars, has perfect data connectivity.
I build apps at the moment. In addition to the phone's network indicators, you really should give your users visible, live feedback on whether your servers are reachable, because there are so many things that can break down in between. Programming your app offline-first is also good, unless it's critically important that the information is either live or absent. We allow offline access by using React Query and persisting its caches in user storage.
> And you could clearly see that on every release, iPhones were shifting the all the breakpoints more and more into the left, rendering more bars with less signal strength.
One thing that might explain this is that advancements in antenna design, RF component selection (including the actual circuit board), and especially (digital) signal processing allow a baseband to extract a useful signal at signal strengths that would have been just noise for older technology.
In ham radio in particular, the progress is amazing. You can do FT8 worldwide (!) communication on less than 5 watts of power, that's absolutely insane.
A friend recently got a (carrier-supplied) phone and has been complaining about how it would often have no reception despite showing a good signal; taking mine to the same areas on the same carrier and doing a comparison, mine was indeed showing no bars on the signal indicator. The difference is, mine predates this stupidity, and I can also see the details in the MTK Engineer Mode app, which shows the actual signal strength --- it was around -140dBm when it was showing 0 bars.
> taking mine to the same areas on the same carrier and doing a comparison
Unfortunately I don't think it's that simple. I've seen one phone simultaneously show significantly different numbers of bars for two SIMs installed in it for the same exact network and operator. After a while they become similar... then differ again... etc.
I have no clue how to explain it yet, but what I do know is that it literally makes no sense with a naive model of how these work, whether you try to explain it as reception or deception.
The phone selects a RAT (radio access technology) and frequency for each SIM slot.
After selecting, each SIM slot is subject to inter freq / inter RAT reselection / handover.
Both are controlled by messages received from the tower (e.g. on 4G LTE, for reselection, System Information messages), though there is an additional constraint: what's supported by/enabled in the phone.
Perhaps one SIM slot was in the connected state and the other was in the idle state at one point, so the reselection logic applied to one and the handover logic to the other. There is, for example, a problem called ping-pong handover. Once a phone is switched to a different frequency or RAT, the tower may keep the phone sort of stuck on the new frequency until the conditions of the previous RAT or frequency improve substantially, to prevent the phone bouncing like a ping-pong ball between the two. This frees resources that would otherwise be spent on repeated handover-related messages.
Each frequency has its own signal strength (free space path loss, transmit power, one frequency might be on one tower and another might be on another, etc).
This usually has a good reason: dual-SIM phones are almost always "DSDS", or "Dual SIM Dual Standby". The secondary SIM, because it doesn't need to make a data connection, parks itself on the lowest-frequency (and therefore usually lowest-bandwidth) connection it can find. Meanwhile, your data-connected SIM is busy trying to stream a video or upload your photos, so it's using a higher-frequency, higher-bandwidth connection, resulting in a lower signal strength.
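The physics behind "each frequency has its own signal strength" is largely free-space path loss, which grows with frequency. A minimal sketch (the 1 km distance and the two carrier frequencies are just illustrative values I picked):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

low = fspl_db(1000, 700e6)   # low-band carrier, 1 km away
high = fspl_db(1000, 3.5e9)  # mid-band 5G carrier, same distance
# The mid-band carrier loses about 14 dB more over the same path,
# so the same tower looks "weaker" on the higher frequency.
print(round(low, 1), round(high, 1), round(high - low, 1))
```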
I highly recommend the Network Cell Info Lite app for network diagnostics. It shows signal strength with full details for each SIM module, shows on a map in real time which base station you are currently connected to, and other interesting statistics.
-140 dBm is far beyond no coverage, yeah. -120 dBm is pretty much when LTE stops working (sometimes it can painfully stretch to -123 to -125 but usually not because of noise etc)
Yes, even thermal background noise (the noise level that exists even in complete absence of RF) would be expected to be above -140dBm. It scales with channel size and temperature.
As near as I can tell, the smallest subcarrier spacing 5G can use is 15 kHz, and the thermal noise floor for a 15 kHz channel at room temperature (300 K) would be about -132 dBm.
My guess is that whatever chip is doing the measurement simply couldn't measure that low accurately, or it reports "nothing detected" as -140 dBm.
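The kTB arithmetic behind that -132 dBm figure, as a quick sketch:

```python
import math

def thermal_noise_dbm(bandwidth_hz, temp_k=300.0):
    """Thermal noise power kTB, expressed in dBm."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    watts = k * temp_k * bandwidth_hz
    return 10 * math.log10(watts / 1e-3)  # convert W to dBm

# 15 kHz subcarrier at room temperature: about -132 dBm, so a reported
# -140 dBm reading would sit below even the thermal noise floor.
print(round(thermal_noise_dbm(15e3), 1))
```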
"Tests carried out by research group PolicyTracker, and shared with BBC's Morning Live, found that nearly 40% of the time a phone displays the 5G symbol, it is actually using a 4G connection"
I worked for a mobile network company a few years ago, the vibe I got there was that 5G penetration was still years away and that none of the providers were anywhere near ready for it.
Interestingly, that company built a bridge of sorts allowing providers to get more life out of their older hardware and software, converting e.g. 5G signals to 4G and 4G to 3G (where a signal is, for example, a phone phoning home to tell the provider it used a megabyte of data, or looking up the IP address when calling a phone number).
Also, where the 2G/3G/4G network signals each had their own protocols (RADIUS and Diameter), 5G is just HTTP. And where they had to write their own code to handle the 3G/4G protocols, for the 5G stuff they just used the cURL library. That is, cURL powers 5G networks.
At least there's some merit to that, since many network don't yet use a 5G core (or SIM cards aren't capable of using it), so the definition of when you "are on 5G" is really murky: https://source.android.com/docs/core/connect/5g-nsa
> You know, I don't recall ever seeing 1 bar of signal strength on a smartphone.
I do.
I'm from Germany, land of perpetual EDGEing. Highest total GDP in the EU but can't build a mobile network for the life of it.
Then again we somehow forgot how to run trains and build cars without cheating, so I guess it fits.
Want to see a single bar? Come visit, our carriers aren't on the list with that inflate flag enabled. I guess they didn't get the same memo as the car manufacturers ;D
I feel you. We have stellar coverage pretty much everywhere in NL. Heck, I was recently in a work video meeting in the car, not a single drop. The route included part of this:
Moving to Germany from countries where mobile networks function is traumatic. My welcome experience was a USB stick with faulty drivers, my balance zeroed immediately because of a package that wasn't activated, then sipping expensive 1 GB data packages over choppy connections. Of course that was all my fault. The only reliable thing was the monthly billing and the telecom's enforcement of the contract length. When I heard before arrival "there is no internet in the apartment but you can simply buy a USB stick", I subconsciously felt there would be problems. Fuck, I hate these memories so much. Fuck everything about it and everyone involved.
> Highest total GDP in the EU but can't build a mobile network for the life of it.
GDP per capita (or GDP per square metre) would be a more useful indicator here. Otherwise, you could throw a bunch of poor countries together, just for statistical purposes, and expect a better mobile network?
I also do, I'm Australian. I regularly experience both congestion caused by tower over-subscription as well as traveling waaaay out into the country where there's no reception, even on the Telstra network that boasts better coverage than everyone else by a mile.
I work with cellular BDA-DAS[1] gear sometimes, and I don't recall the last time I looked at the signal strength display on my phone. It has probably been years.
For me: It either works, or it doesn't work. It is either fast-enough, or impossibly-slow. It's very binary, and the bar graph at the top never told me a damned thing about what I should expect.
[1]: Bi-Directional Amplifier, Distributed Antenna System. In theory, such constructs can make indoor cellular coverage quite good inside of buildings that previously had none. In reality it can be... complicated. And while the bar graph doesn't mean anything, I still need ways to see what's happening as I spend hours, days, or [sometimes!] weeks surveying and troubleshooting and stuff. The phone can report things like RSRP, RSRQ, and some other tasty bits instead of just a useless graph -- and from there, I can sometimes make a hand-waving guess as to what I may reasonably expect for performance.
But that stuff is normally pretty well hidden from view.
> It is either fast-enough, or impossibly-slow. It's very binary, and the bar graph at the top never told me a damned thing about what I should expect.
A few months ago, I was in a remote area at anchor on a sailboat, about 6.5 miles from the nearest highway through the swamp, with only a few farms and a handful of houses within that radius. With my phone up in the cockpit of the boat and tethered over WiFi to my laptop, I was able to download a movie. As the boat swung on anchor, the download was occasionally interrupted, but when data was flowing it was consistently 5-10 MB/s over a claimed 5G link; the movie downloaded in much less time than its runtime. I assume I wasn't competing with much other traffic on that tower, wherever it was. So my experience was even more binary than yours.
The phone's signal indicator did seem to accurately indicate when it had no usable signal at all, but beyond that I'm not sure it was providing any useful information. And I'm not sure if it could have told me anything of use other than "connected" or "not connected". The very marginal connection was still faster than I had any right to expect for those conditions.
I had my car break down in remote mountains, and that little image had me climbing trees until I eventually found a place where I could make a call. Once I had two bars they could hear me; before that, they weren't receiving what I was transmitting.
I had a very dangerous 1-bar the other day. You see, I was in the Canadian wilderness relying on iPhone text-over-satellite, which works well, but only when you have no signal. I needed to relay a message to the rest of my group when suddenly I found myself with one bar of 3G that was completely and totally inoperable. No messages were getting through. To make matters worse, since my phone thought it had a bar, it wouldn't activate satellite. I tried every setting, then spent 20 minutes hiding behind various rocks trying to make my one bar go away, until I finally found a spot that would let me satellite-text again.
Try going into a Home Depot. I don't think I've ever found one where, aside from fairly near the front, I've had more than 0 or 1 bars, across a variety of phones and carriers, in neighborhoods where the signal outside the store was strong.
The net tells me this is because of the aisle after aisle of tall metal shelving, and because the building itself has a lot of metal in its construction.
It is quite annoying when you are trying to use the Home Depot app to look up something.
I've always just blamed the extreme bloat of the web, and the lack of design for poor connections, for the poor performance at 2 bars. HN usually works fine, but that's about it for sites I visit.
Consider yourself blessed, the one place in my neighborhood where I get one bar on LTE is the same place I once was repairing my car. Awful experience but the rest of the subdivision is fine.
For some reason, I've spent several weeks (across a few years) in Italy with exactly one bar of signal strength on an iPhone when roaming on Vodafone.
It must actually be tricky to space out towers that sparsely without creating any obvious coverage gaps, but if anyone is up to the task, it's certainly Vodafone (let's not talk about the actual service quality, though).
Android phones show 1 bar pretty reasonably and fairly. To illustrate: I have 1 bar on both my SIM modules right now, which translates to a -125 dBm signal on both. So the connection is up, but it is borderline low.
Our house is in kind of a hollow despite being in a city, and I (and guests - all networks seem just as bad) get one bar basically all the time at home.
Phone calls are hit-and-miss without WiFi calling switched on.
My house is also like this. No signal on any carrier, and same with one of my neighbors but not neighbor on the opposite side.
Looking at satellite view it is clear that the local municipal water tower is between a huge cell tower near the highway and my house. All carriers seem to lease that same tower only in my area.
It wasn’t like this when I moved in but I guess the carriers consolidated on that big tower near the highway about two years ago.
I see it all the time driving through the country. Probably a dozen times just today, driving along the American east coast. I agree that two bars is the bare minimum for any functionality, though.
Heh, my phone consistently reports 1 bar inside my apartment within a major metropolitan area. Indeed binary, because it works enough for the few times I actually take calls not on wifi.
I would assume that this was a carrier request/demand that got filtered down to some poor employee that had to implement it. There’s a linked bug, but the bug is restricted.
IIRC this really took off with the antennagate fiasco on the iPhone 4. I was working for Verizon at the time, and that was also the first iPhone we were able to sell. I forget who it was, but I believe it was Apple that, in response to people "holding their phone wrong", bumped everything up a bar so you couldn't tell. There was a lot of competition at the time, but all the Androids also had better margins, so they wanted us to sell those instead.
Heh, funny. I recently implemented a countdown for a teleprompting app and that's exactly what I ended up doing to make the countdown "feel right".
The countdown in question doesn't display fractions of a second so it would immediately switch from "5 seconds left" to "4 seconds left" which just doesn't feel right. Adding 0.5s solved the issue.
If you're counting up, round down. If you're counting down, round up. A human expects the count to finish at precisely the moment we get to the last number in the sequence (zero, for counting down). Do a count in your head to see what I mean.
Apple chose a compromise by rounding to nearest, for it to "feel good", but you lose the ability to exactly predict when the timer ends as a human. Typical Apple.
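The rounding conventions discussed above can be compared in a few lines (a sketch; the function names are mine, not Apple's):

```python
import math

def count_up_label(elapsed_s):
    """Counting up: round down, so '5' shows until a full 6 s has elapsed."""
    return math.floor(elapsed_s)

def count_down_label(remaining_s):
    """Counting down: round up, so '1' shows until the timer truly hits zero."""
    return math.ceil(remaining_s)

def nearest_label(remaining_s):
    """Round-to-nearest (the 'add 0.5' trick): feels smoother, but '0'
    appears half a second before the timer actually ends."""
    return math.floor(remaining_s + 0.5)

print(count_down_label(4.2), nearest_label(4.2))  # 5 vs 4
print(count_down_label(0.3), nearest_label(0.3))  # 1 vs 0
```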
From looking at the bottom of the linked post (which says it was edited, not sure when in relation to your comment), it sounds like they wanted something that worked across arbitrary times split across units (hour/minute/seconds) without having to handle carry-over. I'm not sure I would choose to alter the times themselves over making the math a bit more complex, but the author has obviously thought about this a lot more than me, and it's nice that they at least considered that alternative.
So, game-theoretic evil?
The signal strength measurement is actually standardised: https://en.wikipedia.org/wiki/Mobile_phone_signal#ASU
I guess the bars aren't realtime but update every x seconds? I assumed no malice.
Android is quite lazy about searching for towers.
That might be the worst app I’ve used on my iPhone in a year. Better off vibe coding an app to give you signal strength.
I don't need to install an app on my Android phone to see my network signal strength. It's kinda hidden though.
Settings -> About Phone -> tap the SIM slot you want to see info for
Human brains: wow, what a bunch of suckers. Damn.
By the way, is it legal to be deceptive in this way?
> Then again we somehow forgot how to run trains
The mobile networks don't have enough dB and the trains have too much DB?
https://en.wikipedia.org/wiki/Afsluitdijk
Yet, when we visit family in Germany, five minutes after crossing the border we are in a cellular dead zone.
They finally added WiFi a year ago or so.
I hated having to walk near the doors to send a “was it this” question to the wife.
Wifi-calling to the rescue :)
But one bar is death for internet: though HN will often load, anything heavier won't.
Radio shadows are a thing I guess.
I don't know if I want my name on an open source project attached to a commit whose only purpose is to lie?
That's not something I was expecting to hear
But then of course if you can push a customer one way or the other it will be to the higher margin product.
(Probably a way to do it on Android, too)
A CSR showed me this while debugging network connectivity issues with my phone.
Like what Apple does with stopwatch.
https://lukashermann.dev/writing/why-the-iphone-timer-displa...
This signal strength indicator is straight-up lying about the actual signal strength.